Floating search methods in feature selection. Pattern Recognition Letters.
A re-examination of text categorization methods. In Proceedings of the 22nd Annual International ACM SIGIR Conference on Research and Development in Information Retrieval.
Machine learning in automated text categorization. ACM Computing Surveys (CSUR).
Some theoretical aspects of boosting in the presence of noisy data. In Proceedings of the Eighteenth International Conference on Machine Learning (ICML '01).
A decision-theoretic generalization of on-line learning and an application to boosting. In Proceedings of the Second European Conference on Computational Learning Theory (EuroCOLT '95).
FloatBoost learning and statistical face detection. IEEE Transactions on Pattern Analysis and Machine Intelligence.
S-AdaBoost and pattern detection in complex environment. In Proceedings of the 2003 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR '03).
Boosting is a supervised learning method that has been applied successfully to many different domains and has proven to be one of the best performers in text classification tasks to date. FloatBoost learning adds a backtrack mechanism after each iteration of AdaBoost learning to minimize the error rate directly, rather than minimizing an exponential function of the margin as in the traditional AdaBoost algorithm. This paper presents an improved FloatBoost boosting algorithm for boosting Naïve Bayes text classification, called DifBoost, which combines the Divide and Conquer principle with the FloatBoost algorithm. By integrating FloatBoost with the Divide and Conquer principle, DifBoost divides the input space into several sub-spaces during training, and the final classifier is formed as a weighted combination of basic classifiers, where each basic classifier is influenced differently by the different sub-spaces. Extensive experiments on benchmark datasets are conducted, and the encouraging results demonstrate the effectiveness of the proposed algorithm.
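To make the backtrack idea concrete, the following is a minimal, illustrative sketch of FloatBoost-style learning: standard AdaBoost rounds, each followed by a backtrack step that deletes any previously selected weak learner whose removal lowers the ensemble's training error. It is a toy construction, not the paper's DifBoost algorithm: the data, thresholds, and one-dimensional decision stumps are hypothetical stand-ins for the Naïve Bayes base classifiers used in text classification.

```python
import math

# Toy 1-D training set: (feature value, label in {-1, +1}).
# The points at 0.35 (+1) and 0.4 (-1) are deliberately "inverted" so
# that no single threshold stump is perfect and boosting has work to do.
DATA = [(0.1, -1), (0.4, -1), (0.35, +1), (0.8, +1), (0.9, +1), (0.2, -1)]

def stump(threshold, sign):
    """Weak learner: predict `sign` if x > threshold, else -sign."""
    return lambda x: sign if x > threshold else -sign

# Small hypothetical pool of candidate weak learners.
CANDIDATES = [stump(t, s) for t in (0.15, 0.3, 0.5, 0.7) for s in (+1, -1)]

def ensemble_error(learners, alphas):
    """0/1 training error of the weighted-vote ensemble."""
    mistakes = 0
    for x, y in DATA:
        score = sum(a * h(x) for h, a in zip(learners, alphas))
        if (1 if score > 0 else -1) != y:
            mistakes += 1
    return mistakes / len(DATA)

def floatboost(rounds=4):
    w = [1.0 / len(DATA)] * len(DATA)   # example weights
    learners, alphas = [], []
    for _ in range(rounds):
        # Forward (AdaBoost) step: pick the stump with lowest weighted error.
        best, best_err = None, 1.0
        for h in CANDIDATES:
            e = sum(wi for wi, (x, y) in zip(w, DATA) if h(x) != y)
            if e < best_err:
                best, best_err = h, e
        best_err = max(best_err, 1e-10)
        alpha = 0.5 * math.log((1 - best_err) / best_err)
        learners.append(best)
        alphas.append(alpha)
        # Reweight examples, emphasising the ones the stump got wrong.
        w = [wi * math.exp(-alpha * y * best(x)) for wi, (x, y) in zip(w, DATA)]
        z = sum(w)
        w = [wi / z for wi in w]
        # Backtrack step: greedily drop any learner whose removal strictly
        # lowers the ensemble's training error (FloatBoost's key difference
        # from plain AdaBoost, which never revisits earlier choices).
        improved = True
        while improved and len(learners) > 1:
            improved = False
            current = ensemble_error(learners, alphas)
            for i in range(len(learners)):
                trial_h = learners[:i] + learners[i + 1:]
                trial_a = alphas[:i] + alphas[i + 1:]
                if ensemble_error(trial_h, trial_a) < current:
                    learners, alphas = trial_h, trial_a
                    improved = True
                    break
    return learners, alphas

learners, alphas = floatboost()
print("training error:", ensemble_error(learners, alphas))
```

Because the backtrack test uses a strict inequality, the pruning loop always terminates; DifBoost then goes further by partitioning the input space, so that each basic classifier contributes differently in different sub-spaces.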