This paper proposes a novel feature extraction method based on ensemble learning. Error-correcting output codes (ECOC) are used to design binary classifiers (dichotomizers) that separate subsets of classes; the outputs of these dichotomizers serve as linear or nonlinear features with strong separability in a new representation space. In this space, a vector-quantization-based meta-classifier can be viewed as an ECOC decoder, where each learned class prototype acts as the codeword of that class. Extensive experiments on 16 multi-class data sets from the UCI machine learning repository demonstrate the superiority of the proposed method over both existing ECOC approaches and classic feature extraction approaches. In particular, the decoding strategy using a meta-classifier is shown to be more computationally efficient than the linear loss-weighted decoding used in state-of-the-art ECOC methods.
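The decoding idea described above can be sketched as nearest-prototype assignment: each class is represented by a codeword (prototype) in the space of dichotomizer outputs, and a sample is assigned to the class whose codeword is closest. The following is a minimal illustrative sketch, not the paper's implementation; the code matrix, class names, and distance choice are assumptions for the example.

```python
# Hedged sketch: VQ-style (nearest-prototype) ECOC decoding.
# Assumes the dichotomizer outputs for a sample are already available
# as a real-valued vector; the 4-bit code matrix below is hypothetical.

CODE_MATRIX = {
    "A": (+1, +1, -1, -1),  # codeword (prototype) of class A
    "B": (+1, -1, +1, -1),  # codeword of class B
    "C": (-1, +1, +1, +1),  # codeword of class C
}

def decode(outputs):
    """Assign the class whose codeword is nearest to the dichotomizer
    output vector, using squared Euclidean distance."""
    def dist(codeword):
        return sum((o - c) ** 2 for o, c in zip(outputs, codeword))
    return min(CODE_MATRIX, key=lambda cls: dist(CODE_MATRIX[cls]))

# An output vector close to class A's codeword decodes to "A".
print(decode((0.9, 0.8, -0.7, -0.9)))
```

In the method described by the abstract, the prototypes would be learned by the vector-quantization meta-classifier in the new feature space rather than fixed in advance as here.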