Machine Learning
A decision-theoretic generalization of on-line learning and an application to boosting. Journal of Computer and System Sciences - Special issue: 26th annual ACM symposium on the theory of computing & STOC'94, May 23–25, 1994, and second annual European conference on computational learning theory (EuroCOLT'95), March 13–15, 1995
Improved Boosting Algorithms Using Confidence-rated Predictions. Machine Learning - The Eleventh Annual Conference on Computational Learning Theory
Machine Learning
Data Mining and Knowledge Discovery
Text Categorization with Support Vector Machines: Learning with Many Relevant Features. ECML '98 Proceedings of the 10th European Conference on Machine Learning
Exploiting the Cost (In)sensitivity of Decision Tree Splitting Criteria. ICML '00 Proceedings of the Seventeenth International Conference on Machine Learning
Using Diversity with Three Variants of Boosting: Aggressive, Conservative, and Inverse. MCS '02 Proceedings of the Third International Workshop on Multiple Classifier Systems
ICML '97 Proceedings of the Fourteenth International Conference on Machine Learning
Support vector machine active learning with applications to text classification. The Journal of Machine Learning Research
Editorial: special issue on learning from imbalanced data sets. ACM SIGKDD Explorations Newsletter - Special issue on learning from imbalanced datasets
Learning from imbalanced data sets with boosting and data generation: the DataBoost-IM approach. ACM SIGKDD Explorations Newsletter - Special issue on learning from imbalanced datasets
A multistrategy approach for digital text categorization from imbalanced documents. ACM SIGKDD Explorations Newsletter - Special issue on learning from imbalanced datasets
Bias-Variance Analysis of Support Vector Machines for the Development of SVM-Based Ensemble Methods. The Journal of Machine Learning Research
KBA: Kernel Boundary Alignment Considering Imbalanced Data Distribution. IEEE Transactions on Knowledge and Data Engineering
Neural Computation
Generalized Discriminant Analysis Using a Kernel Approach. Neural Computation
SMOTE: synthetic minority over-sampling technique. Journal of Artificial Intelligence Research
Comparing support vector machines with Gaussian kernels to radial basis function classifiers. IEEE Transactions on Signal Processing
Semantic analysis of real-world images using support vector machine. Expert Systems with Applications: An International Journal
On selection and combination of weak learners in AdaBoost. Pattern Recognition Letters
Boosting a multi-linear classifier with application to visual lip reading. Expert Systems with Applications: An International Journal
CIARP'10 Proceedings of the 15th Iberoamerican congress conference on Progress in pattern recognition, image analysis, computer vision, and applications
Adaboost with SVM-based classifier for the classification of brain motor imagery tasks. UAHCI'11 Proceedings of the 6th international conference on Universal access in human-computer interaction: users diversity - Volume Part II
A multiple classifier system for classification of LIDAR remote sensing data using multi-class SVM. MCS'10 Proceedings of the 9th international conference on Multiple Classifier Systems
Incremental face recognition for large-scale social network services. Pattern Recognition
A self-constructing cascade classifier with AdaBoost and SVM for pedestrian detection. Engineering Applications of Artificial Intelligence
Questions about questions: an empirical analysis of information needs on Twitter. Proceedings of the 22nd international conference on World Wide Web
Smoothed emphasis for boosting ensembles. IWANN'13 Proceedings of the 12th international conference on Artificial Neural Networks: advances in computational intelligence - Volume Part I
UIT-ANPR: toward an open framework for automatic number plate recognition on smartphones. Proceedings of the 8th International Conference on Ubiquitous Information Management and Communication
SR-NBS: A fast sparse representation based N-best class selector for robust phoneme classification. Engineering Applications of Artificial Intelligence
Beyond cross-domain learning: Multiple-domain nonnegative matrix factorization. Engineering Applications of Artificial Intelligence
The use of SVM (Support Vector Machine) as a component classifier in AdaBoost may seem to go against the grain of the boosting principle, since SVM is not an easy classifier to train. Moreover, Wickramaratna et al. [2001. Performance degradation in boosting. In: Proceedings of the Second International Workshop on Multiple Classifier Systems, pp. 11-21] show that AdaBoost with strong component classifiers is not viable. In this paper, we show that AdaBoost incorporating properly designed RBFSVM (SVM with the RBF kernel) component classifiers, which we call AdaBoostSVM, can perform as well as SVM. Furthermore, the proposed AdaBoostSVM demonstrates better generalization performance than SVM on imbalanced classification problems. The key idea of AdaBoostSVM is that, over the sequence of trained RBFSVM component classifiers, the RBF kernel width σ starts large (implying weak learning) and is reduced progressively as the boosting iteration proceeds. This effectively produces a set of RBFSVM component classifiers whose model parameters are adaptively varied, yielding better generalization than an AdaBoost approach whose SVM component classifiers all use a fixed (optimal) σ value. On benchmark data sets, we show that our AdaBoostSVM approach outperforms other AdaBoost approaches using component classifiers such as decision trees and neural networks. AdaBoostSVM can be seen as a proof of concept of the idea proposed in Valentini and Dietterich [2004. Bias-variance analysis of support vector machines for the development of SVM-based ensemble methods. Journal of Machine Learning Research 5, 725-775] that AdaBoost with heterogeneous SVMs could work well. Moreover, we extend AdaBoostSVM to the Diverse AdaBoostSVM to address the reported accuracy/diversity dilemma of the original AdaBoost.
By designing parameter-adjusting strategies, the distributions of accuracy and diversity over the RBFSVM component classifiers are tuned to maintain a good balance between them, and promising results have been obtained on benchmark data sets.