The Strength of Weak Learnability
Machine Learning
A decision-theoretic generalization of on-line learning and an application to boosting
Journal of Computer and System Sciences - Special issue: 26th annual ACM symposium on the theory of computing & STOC'94, May 23–25, 1994, and second annual European conference on computational learning theory (EuroCOLT'95), March 13–15, 1995
Principles of data mining
FERNN: An Algorithm for Fast Extraction of Rules from Neural Networks
Applied Intelligence
Multiclass Boosting for Weak Classifiers
The Journal of Machine Learning Research
Feedforward Neural Network Construction Using Cross Validation
Neural Computation
Hierarchical classifier with overlapping class groups
Expert Systems with Applications: An International Journal
PPAM'05 Proceedings of the 6th international conference on Parallel Processing and Applied Mathematics
Risk function estimation for subproblems in a hierarchical classifier
Pattern Recognition Letters
Application of hierarchical classifier to minimal synchronizing word problem
ICAISC'12 Proceedings of the 11th international conference on Artificial Intelligence and Soft Computing - Volume Part I
ART-based fusion of multi-modal perception for robots
Neurocomputing
A novel architecture for a hierarchical classifier (HC) is defined. The objective is to combine several weak classifiers into a strong one, but the approach differs from known methods such as AdaBoost: the training set is split on the basis of the previous classifier's misclassifications between output classes. The problem is divided into overlapping subproblems, each classifying into a different set of output classes. This reduces the task size, since each subproblem has a lower number of output classes, and yields higher accuracy. The HC thus offers an alternative to the boosting approach. Because the groups of output classes overlap, examples from a single class may end up in several subproblems. It is shown that this approach ensures that such a hierarchical classifier achieves better accuracy, and a notion of generalized accuracy is introduced. Subproblem generation is simple, as it is performed with a clustering algorithm operating on classifier outputs. We propose to use the Growing Neural Gas algorithm [1] because of its good adaptiveness.
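The abstract's core idea of forming overlapping groups of output classes from a root classifier's misclassifications can be illustrated with a minimal sketch. Note the paper itself proposes clustering classifier outputs with Growing Neural Gas; the sketch below substitutes a much simpler thresholded confusion-rate rule purely for illustration, and the function name `overlapping_class_groups` and the `threshold` parameter are assumptions, not from the paper.

```python
import numpy as np

def overlapping_class_groups(confusion, threshold=0.1):
    """Split a classification problem into overlapping subproblems.

    For each output class c, build a group containing c plus every class
    the root classifier confuses with c at a rate above `threshold`.
    Groups of mutually confused classes overlap, so examples of a single
    class may end up in several subproblems, as in the abstract.
    (Illustrative stand-in for the paper's GNG-based clustering.)
    """
    # Normalise rows so rates[i, j] ~ P(predicted j | true class i).
    rates = confusion / confusion.sum(axis=1, keepdims=True)
    n = confusion.shape[0]
    groups = []
    for c in range(n):
        group = {c}
        for other in range(n):
            if other != c and (rates[c, other] > threshold
                               or rates[other, c] > threshold):
                group.add(other)
        groups.append(sorted(group))
    # Deduplicate identical groups while preserving order.
    unique = []
    for g in groups:
        if g not in unique:
            unique.append(g)
    return unique

# Toy confusion matrix: classes 0 and 1 are often confused with each
# other, class 2 is well separated, so we expect groups [0, 1] and [2].
conf = np.array([[80, 15,  5],
                 [12, 83,  5],
                 [ 2,  3, 95]])
print(overlapping_class_groups(conf, threshold=0.1))  # → [[0, 1], [2]]
```

Each resulting group then defines one subproblem with fewer output classes, to be handled by its own classifier in the hierarchy.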