We describe a new approach to classification with a very large number of classes, where we assume some class similarity information is available, e.g. through a hierarchical organization of the classes. The proposed method learns a compact binary code for the classes from this similarity information. Binary classifiers are then trained to predict the bits of the code, and decoding is performed with a simple nearest-neighbor rule over the class codewords. This strategy, related to Error-Correcting Output Codes methods, is shown to perform comparably to or better than the standard and efficient one-vs-all approach, at a much lower inference cost.
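The pipeline described above can be sketched as follows. This is a minimal illustration, not the paper's method: the class codewords here are fixed Hadamard rows rather than compact codes learned from class similarity, the per-bit binary learners are plain least-squares fits rather than trained margin classifiers, and the data is synthetic. What it does show is the shared structure: one binary problem per code bit, then nearest-codeword (maximum-agreement) decoding.

```python
import numpy as np

rng = np.random.default_rng(0)

# Codewords in {-1, +1}: rows of an 8x8 Hadamard matrix (pairwise
# Hamming distance 4, so one flipped bit is still decoded correctly).
# In the paper the code is learned from class similarity and has far
# fewer bits than classes; 8 bits for 8 classes is just a small demo.
H = np.array([[1.0]])
for _ in range(3):
    H = np.block([[H, H], [H, -H]])
codes = H                                   # (n_classes, n_bits)

# Synthetic data: one well-separated Gaussian blob per class.
n_classes, dim = 8, 10
centers = rng.normal(size=(n_classes, dim)) * 5.0
X = np.vstack([centers[c] + rng.normal(size=(50, dim))
               for c in range(n_classes)])
y = np.repeat(np.arange(n_classes), 50)

# One linear predictor per code bit, fit jointly by least squares
# (a cheap stand-in for the binary classifiers of the abstract).
targets = codes[y]                          # (n_samples, n_bits)
W, *_ = np.linalg.lstsq(X, targets, rcond=None)

# Inference: n_bits dot products, then pick the codeword that agrees
# with the predicted bits the most (equivalently, min Hamming distance).
bits = np.sign(X @ W)
pred = np.argmax(bits @ codes.T, axis=1)
accuracy = float((pred == y).mean())
```

The point of the inference step is the cost: with a learned code of `b` bits, prediction needs `b` dot products plus a nearest-codeword lookup, versus one dot product per class for one-vs-all, which is the source of the lower inference complexity claimed in the abstract when `b` is much smaller than the number of classes.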