Ensemble confidence estimates posterior probability
MCS'05 Proceedings of the 6th international conference on Multiple Classifier Systems
We have recently introduced Learn++ as an incremental learning algorithm capable of learning from additional data that may become available after training. The strength of Learn++ lies in its ability to learn new data without forgetting previously acquired knowledge and without requiring access to any of the previously seen data, even when the new data introduce new classes. Inspired in part by AdaBoost, Learn++ achieves incremental learning by generating an ensemble of classifiers for each new dataset that becomes available and then combining them through weighted majority voting, using a distribution update rule modified for incremental learning of new classes. We have recently discovered that Learn++ also provides a practical and general-purpose approach to multisensor and/or multimodality data fusion. In this paper, we present Learn++ as an addition to the new breed of classifier fusion algorithms, along with preliminary results obtained on two real-world data fusion applications.
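The weighted-majority combination step described in the abstract can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the function name, the fixed per-classifier weights, and the toy inputs are our own assumptions, whereas in Learn++ the voting weights are derived from each classifier's training error (in the AdaBoost style, e.g. log(1/beta_k)).

```python
import numpy as np

def weighted_majority_vote(predictions, weights, n_classes):
    """Combine hard class predictions by weighted majority voting.

    predictions: array of shape (n_classifiers,) with predicted class labels
    weights:     array of shape (n_classifiers,) with voting weights
                 (hypothetical fixed values here; Learn++ derives them
                 from each classifier's performance during training)
    """
    votes = np.zeros(n_classes)
    for label, w in zip(predictions, weights):
        votes[int(label)] += w  # accumulate each classifier's weighted vote
    return int(np.argmax(votes))  # class with the largest total weight wins

# Three classifiers vote for classes 1, 0, 1 with weights 0.5, 2.0, 0.8:
# class 0 accumulates 2.0, class 1 accumulates 1.3, so class 0 is chosen.
final_label = weighted_majority_vote(np.array([1, 0, 1]),
                                     np.array([0.5, 2.0, 0.8]),
                                     n_classes=2)
```

In a data fusion setting along the lines the paper suggests, the classifiers voting here would be the members of the ensembles trained on the different sensors or modalities, so the weighted vote fuses evidence across data sources as well as across ensemble members.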