In this paper we propose the Local Credibility Concept (LCC), a novel technique for incremental classifiers. It measures the classification rate of the classifier's local models and ensures that the models do not cross the borders between classes, while allowing them to develop freely within the domain of their own class. This reduces the dependency on the order of training samples, an inherent problem of incremental methods, and makes the classifier robust with respect to the choice of its parameters: these only influence the number of models, whereas the performance is controlled automatically by the LCC on a local scale. In contrast to other algorithms, the models of our method are more adaptable, as they can also shrink and vanish. This allows classes to move their domains in the data space, which makes the LCC-Classifier applicable to drifting data concepts as well. We present experiments that demonstrate these capabilities, as well as benchmark tests showing the algorithm's competitive performance.
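To make the abstract's description concrete, the following is a minimal, hypothetical sketch of an incremental classifier in the spirit of the Local Credibility Concept: each local model is a labeled prototype with a radius and a running classification rate (its "credibility"); models grow only inside their own class's domain, shrink when they misclassify a sample near a class border, and vanish when their credibility drops too low. The class names, parameters (`shrink`, `min_credibility`), and update rules here are simplifying assumptions for illustration, not the authors' exact algorithm.

```python
import math


class LocalModel:
    """A hypothetical local model: a labeled prototype with a radius
    and counters for its local classification rate (credibility)."""

    def __init__(self, center, label, radius=1.0):
        self.center = list(center)
        self.label = label
        self.radius = radius
        self.correct = 1  # samples this model classified correctly
        self.total = 1    # samples this model was responsible for

    def credibility(self):
        return self.correct / self.total

    def distance(self, x):
        return math.dist(self.center, x)


class LCCClassifier:
    """Illustrative incremental classifier: local models are created,
    grown, shrunk, and removed based on their local credibility."""

    def __init__(self, shrink=0.5, min_credibility=0.3):
        self.models = []
        self.shrink = shrink                  # factor applied on a border conflict
        self.min_credibility = min_credibility  # below this, a model vanishes

    def predict(self, x):
        if not self.models:
            return None
        # nearest-prototype decision over the surviving local models
        return min(self.models, key=lambda m: m.distance(x)).label

    def partial_fit(self, x, y):
        if not self.models:
            self.models.append(LocalModel(x, y))
            return
        nearest = min(self.models, key=lambda m: m.distance(x))
        nearest.total += 1
        if nearest.label == y:
            nearest.correct += 1
            # develop freely within the model's own class domain
            nearest.radius = max(nearest.radius, nearest.distance(x))
        else:
            # the model reached across a class border: shrink it
            # and spawn a new local model for the sample's class
            nearest.radius *= self.shrink
            self.models.append(LocalModel(x, y))
        # models whose local classification rate is too low vanish
        self.models = [m for m in self.models
                       if m.credibility() >= self.min_credibility]
```

Because models can shrink and be removed, a class whose samples drift through the data space gradually abandons its old prototypes and grows new ones, which is the mechanism the abstract credits for handling drifting data concepts.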