Floating search methods in feature selection. Pattern Recognition Letters.
Machine Learning.
A decision-theoretic generalization of on-line learning and an application to boosting. Journal of Computer and System Sciences, special issue: 26th Annual ACM Symposium on the Theory of Computing (STOC'94), May 23–25, 1994, and Second Annual European Conference on Computational Learning Theory (EuroCOLT'95), March 13–15, 1995.
IEEE Transactions on Pattern Analysis and Machine Intelligence.
The Random Subspace Method for Constructing Decision Forests. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Machine Learning.
Pattern Classification (2nd Edition).
Rotation Forest: A New Classifier Ensemble Method. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Cluster-based pattern discrimination: A novel technique for feature selection. Pattern Recognition Letters.
Rapid and brief communication: FuzzyBagging: A novel ensemble of classifiers. Pattern Recognition.
A hybrid wavelet-based fingerprint matcher. Pattern Recognition.
Expert Systems with Applications: An International Journal.
Switching class labels to generate classification ensembles. Pattern Recognition.
Input Decimated Ensemble based on Neighborhood Preserving Embedding for spectrogram classification. Expert Systems with Applications: An International Journal.
Creating ensembles of classifiers via fuzzy clustering and deflection. Fuzzy Sets and Systems.
Reduced Reward-punishment editing for building ensembles of classifiers. Expert Systems with Applications: An International Journal.
Greedy optimization classifiers ensemble based on diversity. Pattern Recognition.
IEEE/ACM Transactions on Computational Biology and Bioinformatics (TCBB).
A classifier ensemble approach for the missing feature problem. Artificial Intelligence in Medicine.
In this paper, we have made an extensive study of artificial intelligence (AI) techniques, namely ensembles of classifiers and feature selection, for the identification of students with learning disabilities (LD). The experimental results show that our best method, which combines ensembles of classifiers with feature selection, can correctly identify up to 50% of the LD students with 100% confidence. We also evaluate cross-dataset prediction: a model built on the "elementary school" students is used to predict the samples in the "junior high school" dataset, and, conversely, a model built on the "junior high school" samples is used to predict the samples in the "elementary school" dataset. In particular, we propose variants of two recent feature-transform-based ensemble methods, Rotation Forest and Input Decimated Ensemble. In Rotation Forest, the feature set is randomly split into subsets and Principal Component Analysis (PCA) is used to transform the features that belong to each subset. The Input Decimated Ensemble first singles out a given class i and runs PCA on the data of that class only; the resulting transformation is applied to the whole dataset, and a classifier D_i is trained on the transformed patterns. This choice limits the size of the ensemble to the number of classes. In this paper, we perform an empirical comparison in which the feature-transform method used in the Rotation Forest technique is varied, and we propose a clustering method to overcome this drawback of the Input Decimated Ensemble.
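As a concrete illustration of the two base transforms described above, the following minimal Python sketch (assuming NumPy; the function names and the PCA-via-SVD helper are our own illustration, not the paper's code) builds the block-diagonal Rotation Forest rotation from per-subset PCA, and the per-class PCA projection used by the Input Decimated Ensemble.

```python
import numpy as np

def pca_components(X, n_components):
    # Principal axes of X via SVD of the centered data;
    # returns an (n_features, n_components) projection matrix.
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Vt[:n_components].T

def rotation_forest_transform(X, n_subsets, rng):
    # Rotation Forest step: randomly split the feature indices into
    # disjoint subsets, run PCA on each subset, and place the
    # per-subset axes into one block-diagonal rotation matrix R.
    n_features = X.shape[1]
    subsets = np.array_split(rng.permutation(n_features), n_subsets)
    R = np.zeros((n_features, n_features))
    for sub in subsets:
        R[np.ix_(sub, sub)] = pca_components(X[:, sub], len(sub))
    return X @ R  # each base classifier trains on one such rotation

def input_decimated_transform(X, y, target_class, n_components):
    # Input Decimated Ensemble step: fit PCA on the patterns of one
    # class only, then project the *whole* dataset with it; the
    # classifier D_i is then trained on the projected patterns.
    P = pca_components(X[y == target_class], n_components)
    return X @ P
```

Because one transform (and hence one classifier D_i) is fitted per class, the Input Decimated Ensemble has at most as many members as there are classes, which is exactly the size limitation that the clustering variant proposed in the paper is meant to overcome.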