The area under the ROC curve (AUC) provides a scalar measure of ranking performance that requires no specific decision threshold when comparing classifiers. AUC is well suited to imprecise environments because it is independent of class distributions and misclassification costs. Direct optimization of the AUC criterion is therefore a natural choice for binary classifier design. However, a direct formulation based on the AUC criterion incurs a high computational cost, since the number of input sample pairs grows quadratically with the data size. In this paper, we propose an online learning algorithm that circumvents this computational problem for binary classification. Unlike conventional recursive formulations, the proposed formulation uses a pairwise cost function that pairs each newly arrived data point with the stored points of the opposite class. Moreover, by incorporating sparse learning into the online formulation, the computational effort can be reduced significantly. Our empirical results on public databases of three different scales show promising performance in terms of classification AUC, accuracy, and computational efficiency.
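The pairwise idea described above can be illustrated with a minimal sketch. This is not the paper's algorithm (which involves a recursive formulation and sparse learning); it is a simplified online linear scorer that, for each incoming point, takes gradient steps on a pairwise hinge loss against a bounded buffer of opposite-class points. The function names, learning rate, margin, and buffer size are illustrative assumptions.

```python
import numpy as np

def online_auc_update(w, x, y, buf_pos, buf_neg,
                      lr=0.1, margin=1.0, buf_size=50):
    """One online step: pair (x, y) with buffered opposite-class points
    and take a gradient step on each violated pairwise hinge loss.
    (Illustrative sketch, not the paper's exact formulation.)"""
    opposite = buf_neg if y == 1 else buf_pos
    for x_opp in opposite:
        # The positive sample's score should exceed the negative's by `margin`.
        diff = (x - x_opp) if y == 1 else (x_opp - x)
        if w @ diff < margin:          # pairwise hinge violation
            w = w + lr * diff          # gradient step on this pair
    # Store the new point; a bounded buffer keeps per-step cost manageable.
    buf = buf_pos if y == 1 else buf_neg
    buf.append(x)
    if len(buf) > buf_size:
        buf.pop(0)
    return w

def auc(w, X, y):
    """Empirical AUC: fraction of (positive, negative) pairs ranked correctly."""
    s = X @ w
    pos, neg = s[y == 1], s[y == 0]
    return float(np.mean(pos[:, None] > neg[None, :]))

# Synthetic linearly-rankable data with label noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 5))
y = (X[:, 0] + 0.5 * X[:, 1] + 0.3 * rng.normal(size=400) > 0).astype(int)

w = np.zeros(5)
buf_pos, buf_neg = [], []
for xi, yi in zip(X, y):
    w = online_auc_update(w, xi, yi, buf_pos, buf_neg)
```

Because each update touches only the buffered opposite-class points rather than all past pairs, the per-step cost stays bounded, which is the computational motivation behind the online pairwise formulation.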