Each base classifier trained by the AdaBoost ensemble learning algorithm is assigned a constant weight that is applied to every test instance. From the viewpoint of AdaBoost's iterative process, however, each base classifier performs well only in a certain small region of the input space, so using the same weight for all test samples is unreasonable. An improved AdaBoost algorithm based on adaptive weight adjustment is presented. Classifier selection and weighting are determined by the full-information behavior correlation, which describes the correlation between a test sample and each base classifier. Because the method uses every scalar of a base classifier's full-information behavior, it avoids losing information. Simulation results show that the ensemble's classification performance is greatly improved.
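The abstract does not define the "full information behavior correlation" precisely, so the sketch below illustrates only the general idea: train a standard AdaBoost ensemble of decision stumps, then at prediction time scale each classifier's constant weight by a per-sample competence score. Here that score is a simple Gaussian kernel between the test point and the centroid of the training points the stump classified correctly; the kernel, the toy 1-D dataset, and all function names are illustrative assumptions, not the paper's actual measure.

```python
import math

# Toy 1-D dataset, deliberately not separable by a single stump:
# label +1 on the left and right clusters, -1 in the middle.
X = [0.5, 1.0, 1.5, 2.5, 3.0, 3.5, 4.5, 5.0, 5.5]
y = [1, 1, 1, -1, -1, -1, 1, 1, 1]

def stump_predict(threshold, polarity, x):
    # A decision stump: predicts `polarity` above the threshold, its
    # negation below it.
    return polarity if x >= threshold else -polarity

def train_adaboost(X, y, rounds=3):
    # Standard discrete AdaBoost over decision stumps.
    n = len(X)
    w = [1.0 / n] * n                 # instance weights
    ensemble = []                     # list of (threshold, polarity, alpha)
    for _ in range(rounds):
        best = None
        for thr in sorted(set(X)):    # exhaustive stump search
            for pol in (1, -1):
                err = sum(wi for xi, yi, wi in zip(X, y, w)
                          if stump_predict(thr, pol, xi) != yi)
                if best is None or err < best[0]:
                    best = (err, thr, pol)
        err, thr, pol = best
        err = min(max(err, 1e-10), 1 - 1e-10)
        alpha = 0.5 * math.log((1 - err) / err)   # constant classifier weight
        # Reweight instances: up-weight mistakes, down-weight correct ones.
        w = [wi * math.exp(-alpha * yi * stump_predict(thr, pol, xi))
             for xi, yi, wi in zip(X, y, w)]
        s = sum(w)
        w = [wi / s for wi in w]
        ensemble.append((thr, pol, alpha))
    return ensemble

def predict_standard(ensemble, x):
    # Plain AdaBoost vote: every alpha is the same for every test sample.
    score = sum(alpha * stump_predict(thr, pol, x)
                for thr, pol, alpha in ensemble)
    return 1 if score >= 0 else -1

def predict_adaptive(ensemble, X, y, x, bandwidth=1.0):
    # Adaptive variant: each alpha is scaled by a competence score that
    # depends on the test sample -- here, a Gaussian kernel between x and
    # the centroid of training points the stump got right (an assumed
    # stand-in for the paper's full-information behavior correlation).
    score = 0.0
    for thr, pol, alpha in ensemble:
        correct = [xi for xi, yi in zip(X, y)
                   if stump_predict(thr, pol, xi) == yi]
        if correct:
            center = sum(correct) / len(correct)
            competence = math.exp(-((x - center) / bandwidth) ** 2)
        else:
            competence = 0.0
        score += alpha * competence * stump_predict(thr, pol, x)
    return 1 if score >= 0 else -1
```

On this toy set, three rounds suffice for the ensemble to fit the training data, and the kernel-weighted vote agrees with the labels as well; the point of the sketch is only that the effective classifier weight now varies with the test sample's location in input space.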