Classifiers are often data-dependent: a classifier that performs well on one type of data may fail on another data set. There is therefore a need for robust classification algorithms that exhibit stable performance across multiple types of data. One way to address this problem is to fuse the decisions of different classifiers when identifying a particular class. In this paper, we implement classifier fusion using six different classifiers to classify microarray gene expression data from breast cancer patients. Two fusion models are used to improve classification accuracy: majority voting and random bagging. Our experimental results show that the proposed classifier fusion methodology outperforms single classification models.
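The majority-voting fusion model described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the six base classifiers and their predictions are hypothetical placeholders; the sketch only shows how per-classifier label predictions are combined by taking the most frequent label for each sample.

```python
from collections import Counter

def majority_vote(predictions):
    """Fuse per-classifier predictions by majority voting.

    predictions: a list of label sequences, one per base classifier,
    all of the same length (one label per sample).
    Returns the fused label list: for each sample, the label
    predicted by the largest number of classifiers.
    """
    fused = []
    for labels in zip(*predictions):  # labels from all classifiers for one sample
        fused.append(Counter(labels).most_common(1)[0][0])
    return fused

# Hypothetical predictions from three of the base classifiers on five samples
p1 = ["cancer", "normal", "cancer", "normal", "cancer"]
p2 = ["cancer", "cancer", "cancer", "normal", "normal"]
p3 = ["normal", "normal", "cancer", "normal", "cancer"]

print(majority_vote([p1, p2, p3]))
# → ['cancer', 'normal', 'cancer', 'normal', 'cancer']
```

With an odd number of classifiers on a binary task (as with ties broken by `Counter`'s insertion order here), every sample receives a definite fused label; the paper's random-bagging variant would instead resample the training data before fusing.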