Traditional approaches to combining classifiers build committees on the basis of the outputs of simple classifiers. There are two main concepts: the first uses the class labels returned by the simple classifiers, and the second is based on their discriminant functions. However, classifier fusion need not be performed as a mixing of outputs; it can instead be carried out as a fusion of the training information that the classifiers use to make decisions. Note that classifiers use two general forms of information: learning sets and rules. This paper presents a concept of information fusion in which both types of information are used together during NGE (Nested Generalized Exemplar) learning. NGE is a learning method that generalizes a given training set into a set of hyperrectangles in an n-dimensional Euclidean space. The NGE algorithm can be considered a descendant of minimal-distance classifiers, known as lazy classifiers, such as the k-NN (k-Nearest Neighbor) classifier. For a new example, the class of the smallest hyperrectangle that contains it is predicted. If the new example does not fall within any hyperrectangle, the algorithm predicts the class of the closest generalized exemplar, determined by computing the Euclidean distance to the nearest generalized hyperrectangle. This paper describes a version of the NGE model that produces a set of hyperrectangles on the basis of a training set together with a set of rules. The quality of the proposed modification of NGE (called RB-NGE) has been evaluated and compared with the original NGE and k-NN in computer experiments.
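The classification rule described above (smallest containing hyperrectangle, else the nearest one by Euclidean distance) can be sketched as follows. This is a minimal illustration under assumed data structures, not the authors' implementation: a hyperrectangle is represented here as a tuple of per-feature lower bounds, upper bounds, and a class label.

```python
import math

# Assumed representation (not from the paper): a hyperrectangle is
# (lows, highs, label), where lows/highs are per-feature bounds.

def hr_contains(hr, x):
    """True if point x lies inside the hyperrectangle on every dimension."""
    lows, highs, _ = hr
    return all(lo <= xi <= hi for lo, xi, hi in zip(lows, x, highs))

def hr_volume(hr):
    """Product of side lengths; used to find the smallest (most specific) box."""
    lows, highs, _ = hr
    vol = 1.0
    for lo, hi in zip(lows, highs):
        vol *= (hi - lo)
    return vol

def hr_distance(hr, x):
    """Euclidean distance from x to the hyperrectangle's surface
    (zero along dimensions where x already lies within the bounds)."""
    lows, highs, _ = hr
    return math.sqrt(sum(max(lo - xi, 0.0, xi - hi) ** 2
                         for lo, xi, hi in zip(lows, x, highs)))

def nge_classify(hyperrectangles, x):
    """Predict the class of x from a set of generalized exemplars."""
    containing = [hr for hr in hyperrectangles if hr_contains(hr, x)]
    if containing:
        # Nested exemplars: prefer the smallest containing hyperrectangle.
        return min(containing, key=hr_volume)[2]
    # Otherwise fall back to the nearest hyperrectangle.
    return min(hyperrectangles, key=lambda hr: hr_distance(hr, x))[2]
```

For example, with a small box of class "B" nested inside a larger box of class "A", a point inside both is assigned "B", while a point outside every box is assigned the class of the nearest box.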