Ensemble methods such as Bagging and Boosting, which combine the decisions of multiple hypotheses, are among the strongest existing machine learning methods. The diversity of the members of an ensemble is known to be an important factor in determining its generalization error. We present a new method for generating ensembles, named CDEBMTE (Creation of Diverse Ensemble Based on Manipulation of Training Examples), that directly constructs diverse hypotheses by manipulating the training examples in three ways: (1) sub-sampling the training examples, (2) decreasing/increasing the presence of error-prone training examples, and (3) decreasing/increasing the presence of neighbors of error-prone training examples. The technique is a simple, general meta-learner that can use any strong learner as a base classifier to build diverse committees. We show that CDEBMTE can be used effectively to achieve higher accuracy and to obtain better class membership probability estimates. Experimental results using two well-known base learners, (1) decision-tree induction and (2) the multilayer perceptron, demonstrate that this approach consistently achieves higher predictive accuracy than the base classifier, AdaBoost, and Bagging. CDEBMTE also outperforms AdaBoost more prominently as the training set grows larger.
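The abstract gives only a high-level description of the three manipulations, so the following is a minimal, illustrative Python sketch (using scikit-learn decision trees as the base learner) of a CDEBMTE-style ensemble builder. It is not the authors' exact procedure: the function names, the multiplicative re-weighting scheme, the subsample ratio, and the majority-vote combiner are all assumptions filled in for illustration.

import numpy as np
from sklearn.base import clone
from sklearn.tree import DecisionTreeClassifier
from sklearn.neighbors import NearestNeighbors

def cdebmte_sketch(X, y, base=None, n_members=9, subsample=0.7,
                   boost=2.0, n_neighbors=5, seed=None):
    """Illustrative CDEBMTE-style ensemble builder (not the paper's exact algorithm).

    The three manipulations follow the abstract's description:
    (1) sub-sample the training examples,
    (2) re-weight error-prone training examples,
    (3) re-weight neighbors of error-prone training examples.
    The weighting factors and sampling scheme here are assumptions.
    """
    rng = np.random.default_rng(seed)
    base = base or DecisionTreeClassifier()
    n = len(X)
    weights = np.ones(n)
    members = []
    # Precompute each point's neighbors; column 0 is the point itself.
    _, nbrs = NearestNeighbors(n_neighbors=n_neighbors + 1).fit(X).kneighbors(X)

    for _ in range(n_members):
        # (1) Sub-sample training examples according to the current weights.
        p = weights / weights.sum()
        idx = rng.choice(n, size=int(subsample * n), replace=True, p=p)
        clf = clone(base).fit(X[idx], y[idx])
        members.append(clf)

        # Identify error-prone examples under the newest member.
        errors = clf.predict(X) != y

        # (2) Increase the weight of error-prone examples (decreasing well-
        # classified ones would be the analogous opposite manipulation).
        weights[errors] *= boost
        # (3) Increase the weight of their neighbors, more mildly.
        weights[nbrs[errors].ravel()] *= np.sqrt(boost)

    return members

def predict_majority(members, X):
    """Combine members by simple majority vote (one plausible combiner).

    Assumes integer class labels.
    """
    votes = np.stack([m.predict(X) for m in members])
    return np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)

Usage would be along the lines of members = cdebmte_sketch(X_train, y_train) followed by predict_majority(members, X_test). Re-weighting error-prone points and their neighborhoods pushes successive members toward different regions of the input space, which is one plausible way to realize the diversity the abstract emphasizes.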