This paper proposes a new classifier called density-induced margin support vector machines (DMSVMs). DMSVMs belong to the family of SVM-like classifiers and therefore inherit desirable properties of support vector machines (SVMs), such as a unique, global solution and a sparse representation of the decision function. For a given data set, DMSVMs require extracting a relative density degree for every training point; these degrees can be interpreted as relative margins of the corresponding training points. We propose a method for estimating relative density degrees using the K-nearest-neighbor method. We also derive and prove an upper bound on the leave-one-out error of DMSVMs for binary classification. Promising results are obtained on both toy and real-world data sets.
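To make the K-nearest-neighbor step concrete, the sketch below estimates a relative density degree for each training point. It is only an illustrative assumption, not the paper's exact formula: here the local density is taken to be inversely proportional to the distance to the k-th nearest neighbor and then normalized by the mean density, so that points in dense regions receive degrees above 1. The function name `relative_density_degrees` and the parameter `k` are hypothetical.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def relative_density_degrees(X, k=5):
    """Rough sketch of KNN-based relative density estimation.

    Assumption: density at a point is inversely related to the distance
    to its k-th nearest neighbor, then rescaled so the mean degree is 1.
    This is an illustration, not the authors' exact definition.
    """
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
    dist, _ = nn.kneighbors(X)          # dist[:, 0] is the point itself (distance 0)
    kth = dist[:, -1]                   # distance to the k-th true neighbor
    density = 1.0 / (kth + 1e-12)       # simple local density estimate
    return density / density.mean()     # relative degrees, mean equals 1

# Toy usage: a tight cluster mixed with a spread-out one.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(scale=0.3, size=(50, 2)),   # dense cluster
               rng.normal(scale=2.0, size=(50, 2))])  # sparse cluster
degrees = relative_density_degrees(X, k=5)
print(degrees[:5])   # typically above 1 (dense region)
print(degrees[-5:])  # typically below 1 (sparse region)
```

In the DMSVM setting, such degrees would then scale the margin of each training point, giving denser regions relatively larger influence on the decision boundary; the exact weighting scheme follows the paper rather than this sketch.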