In this paper a novel complex classifier architecture is proposed: a hierarchical, tree-like structure with a simple artificial neural network (ANN) at each node. The structure for a given problem is not preset but is built during training. The training algorithm's ability to grow the tree rests on the assumption that when a weak classifier (i.e., one that classifies only slightly better than a random classifier) frequently confuses examples from two output classes, those classes must carry similar information and therefore constitute a sub-problem. After each ANN has been trained, its misclassifications are analyzed and new sub-problems are formed; a new ANN is then trained for each of these sub-problems, forming the next layer of the hierarchical classifier. An important feature of the hierarchical classifier proposed in this work is that the problem partition yields overlapping sub-problems. Classification therefore does not follow just a single path from the root but may fork, enhancing the power of the classification. It is shown how to combine the results of these individual classifiers.
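The training loop described above — train a weak classifier, inspect which class pairs it confuses, and spawn child classifiers for those overlapping sub-problems — can be sketched as follows. This is a minimal illustration, not the authors' implementation: a nearest-centroid rule stands in for the small ANNs, one tree layer is built, and all names (`WeakClassifier`, `HierarchicalClassifier`, `threshold`) are hypothetical.

```python
from collections import defaultdict

class WeakClassifier:
    """Nearest-centroid rule: a stand-in for the small ANN at each node."""
    def fit(self, X, y):
        sums = defaultdict(lambda: [0.0] * len(X[0]))
        counts = defaultdict(int)
        for x, label in zip(X, y):
            counts[label] += 1
            for i, v in enumerate(x):
                sums[label][i] += v
        self.centroids = {c: [s / counts[c] for s in sums[c]] for c in sums}
        return self

    def predict(self, x):
        dist = lambda c: sum((a - b) ** 2 for a, b in zip(x, self.centroids[c]))
        return min(self.centroids, key=dist)

class HierarchicalClassifier:
    def __init__(self, threshold=2):
        # Minimum number of mismatches between two classes before they
        # are considered a sub-problem (hypothetical parameter).
        self.threshold = threshold

    def fit(self, X, y):
        self.root = WeakClassifier().fit(X, y)
        # Count mismatches between class pairs on the training data.
        confusion = defaultdict(int)
        for x, true in zip(X, y):
            pred = self.root.predict(x)
            if pred != true:
                confusion[frozenset((true, pred))] += 1
        # Frequently confused class pairs become sub-problems; a class may
        # appear in several pairs, so the sub-problems overlap.
        self.children = {}
        for pair, count in confusion.items():
            if count >= self.threshold:
                idx = [i for i, label in enumerate(y) if label in pair]
                self.children[pair] = WeakClassifier().fit(
                    [X[i] for i in idx], [y[i] for i in idx])
        return self

    def predict(self, x):
        pred = self.root.predict(x)
        # Fork into every sub-problem containing the root's prediction
        # and combine the individual answers by majority vote (one simple
        # combination rule; the paper derives its own).
        votes = defaultdict(int)
        votes[pred] += 1
        for pair, child in self.children.items():
            if pred in pair:
                votes[child.predict(x)] += 1
        return max(votes, key=votes.get)
```

The key design point the sketch captures is that sub-problems are keyed by confused class *pairs* rather than by a disjoint partition, so a query whose root prediction belongs to several pairs is routed down several branches at once.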