In this study we address the linear classification of noisy high-dimensional data in a two-class scenario, assuming that the number of samples is much smaller than the dimensionality. The presence of noise further intensifies the classification problem in this setting. Eleven linear classifiers were compared, in terms of classification accuracy and robustness, on 2150 artificial datasets from four different experimental setups and on five real-world gene expression profile datasets. We focus specifically on linear classifiers because more complex concept classes would make over-adaptation even more likely. Classification accuracy is measured by the mean error rate and the mean rank of the error rate. By these criteria, two large-margin classifiers, SVM and ALMA, and an online classification algorithm called PA rank at the top, with PA differing statistically significantly from SVM on the artificial data. Surprisingly, these algorithms also statistically significantly outperformed all of the investigated classifiers that use dimensionality reduction.
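The experimental setting described above can be sketched in code. The following is a minimal illustration, not the study's actual protocol: it generates a synthetic two-class dataset with far fewer samples than features and a small amount of label noise (all sizes and noise levels here are assumptions chosen for illustration), then estimates cross-validated error rates for a linear SVM and the Passive-Aggressive (PA) classifier, two of the top-ranked methods.

```python
# Hedged sketch of the evaluation setting: n_samples << n_features,
# noisy labels, two classes, linear classifiers compared by error rate.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import PassiveAggressiveClassifier
from sklearn.model_selection import cross_val_score
from sklearn.svm import LinearSVC

# Cardinality much lower than dimensionality: 50 samples, 1000 features.
# flip_y injects label noise; all values are illustrative assumptions.
X, y = make_classification(
    n_samples=50,
    n_features=1000,
    n_informative=20,
    flip_y=0.05,
    random_state=0,
)

classifiers = {
    "SVM (linear, large margin)": LinearSVC(C=1.0, max_iter=10000),
    "PA (online passive-aggressive)": PassiveAggressiveClassifier(
        max_iter=1000, random_state=0
    ),
}

# Mean error rate over 5-fold cross-validation, as in the accuracy criterion.
for name, clf in classifiers.items():
    accuracy = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: mean error rate = {1.0 - accuracy:.3f}")
```

In the study itself, such error rates were additionally summarized as mean ranks across many datasets and tested for statistically significant differences between classifiers.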