The purpose of this research is to develop a classifier that achieves state-of-the-art computational efficiency and generalization ability while allowing the algorithm designer to choose arbitrary loss functions appropriate for a given problem domain. This flexibility is critical in applications involving heavily imbalanced, noisy, or non-Gaussian distributed data. To achieve this goal, a kernel matching pursuit (KMP) framework is formulated in which the objective is margin maximization rather than standard error minimization. This approach yields strong performance and computational savings on large, imbalanced training sets and facilitates the development of two general algorithms. Both algorithms support arbitrary loss functions, letting the designer control the degree to which outliers are penalized and the manner in which non-Gaussian distributed data are handled. Example loss functions are provided, and algorithm performance is illustrated in two groups of experimental results. The first group demonstrates that the proposed algorithms perform on par with several state-of-the-art machine learning algorithms on widely published, balanced data sets. The second group illustrates superior performance by the proposed algorithms on imbalanced, non-Gaussian data, achieved by employing loss functions matched to the data characteristics and problem domain.
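As a rough illustration of the general idea (not the authors' implementation), a greedy kernel matching pursuit with a pluggable margin-based loss might be sketched as follows. The RBF kernel, hinge loss, weight grid, and all function names here are illustrative assumptions; swapping `hinge_loss` for a loss that saturates on outliers is the kind of designer choice the abstract describes.

```python
import numpy as np

def rbf_kernel(X, C, gamma=0.5):
    """RBF kernel matrix between rows of X and rows of C."""
    d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def hinge_loss(margins):
    """Hinge loss penalizes margins y*f(x) below 1; any other
    margin-based loss can be dropped in unchanged."""
    return np.maximum(0.0, 1.0 - margins).sum()

def kmp_fit(X, y, loss=hinge_loss, n_terms=4, gamma=0.5):
    """Greedy kernel matching pursuit with a margin-based objective:
    each round adds the kernel function (centered on one training
    point) and weight that most reduce the chosen loss on y*f."""
    K = rbf_kernel(X, X, gamma)        # dictionary: one column per center
    f = np.zeros(len(X))               # current decision values
    chosen, weights = [], []
    grid = np.linspace(-2.0, 2.0, 41)  # crude line search over weights
    for _ in range(n_terms):
        best = (None, 0.0, np.inf)
        for j in range(len(X)):
            if j in chosen:
                continue
            for w in grid:
                l = loss(y * (f + w * K[:, j]))
                if l < best[2]:
                    best = (j, w, l)
        j, w, _ = best
        chosen.append(j)
        weights.append(w)
        f = f + w * K[:, j]            # loss is non-increasing (w=0 is in grid)
    return np.array(chosen), np.array(weights)

def kmp_predict(X, X_train, chosen, weights, gamma=0.5):
    """Evaluate the sparse kernel expansion on new points."""
    K = rbf_kernel(X, X_train[chosen], gamma)
    return np.sign(K @ weights)
```

On a small toy problem the greedy loop selects only a few kernel centers, which is the source of the computational savings relative to training on the full kernel matrix.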