Feature Selection: Evaluation, Application, and Small Sample Performance
IEEE Transactions on Pattern Analysis and Machine Intelligence
Wrappers for feature subset selection
Artificial Intelligence - Special issue on relevance
The Random Subspace Method for Constructing Decision Forests
IEEE Transactions on Pattern Analysis and Machine Intelligence
Feature selection for ensembles
AAAI '99/IAAI '99 Proceedings of the Sixteenth National Conference on Artificial Intelligence and the Eleventh Innovative Applications of Artificial Intelligence Conference
Unsupervised Feature Selection Applied to Content-Based Retrieval of Lung Images
IEEE Transactions on Pattern Analysis and Machine Intelligence
Clustering with Instance-level Constraints
ICML '00 Proceedings of the Seventeenth International Conference on Machine Learning
Cluster ensembles --- a knowledge reuse framework for combining multiple partitions
The Journal of Machine Learning Research
An introduction to variable and feature selection
The Journal of Machine Learning Research
Diverse ensembles for active learning
ICML '04 Proceedings of the twenty-first international conference on Machine learning
Feature Selection for Unsupervised Learning
The Journal of Machine Learning Research
Efficient Feature Selection via Analysis of Relevance and Redundancy
The Journal of Machine Learning Research
Feature Subset Selection and Feature Ranking for Multivariate Time Series
IEEE Transactions on Knowledge and Data Engineering
Learning a Mahalanobis Metric from Equivalence Constraints
The Journal of Machine Learning Research
Clustering Ensembles: Models of Consensus and Weak Partitions
IEEE Transactions on Pattern Analysis and Machine Intelligence
Rotation Forest: A New Classifier Ensemble Method
IEEE Transactions on Pattern Analysis and Machine Intelligence
Kernel quadratic discriminant analysis for small sample size problem
Pattern Recognition
Feature selection with dynamic mutual information
Pattern Recognition
Feature selection based on loss-margin of nearest neighbor classification
Pattern Recognition
An improvement on floating search algorithms for feature subset selection
Pattern Recognition
Constraint projections for ensemble learning
AAAI'08 Proceedings of the 23rd national conference on Artificial intelligence - Volume 2
Value, cost, and sharing: open issues in constrained clustering
KDID'06 Proceedings of the 5th international conference on Knowledge discovery in inductive databases
A discriminative learning framework with pairwise constraints for video object classification
CVPR'04 Proceedings of the 2004 IEEE computer society conference on Computer vision and pattern recognition
Measuring constraint-set utility for partitional clustering algorithms
PKDD'06 Proceedings of the 10th European conference on Principles and Practice of Knowledge Discovery in Databases
Fuzzy SVM for content-based image retrieval: a pseudo-label support vector machine framework
IEEE Computational Intelligence Magazine
A soft relevance framework in content-based image retrieval systems
IEEE Transactions on Circuits and Systems for Video Technology
Constraint scores for semi-supervised feature selection: A comparative study
Pattern Recognition Letters
Constrained Laplacian Score for semi-supervised feature selection
ECML PKDD'11 Proceedings of the 2011 European conference on Machine learning and knowledge discovery in databases - Volume Part I
Constraint Score Evaluation for Spectral Feature Selection
Neural Processing Letters
Constraint Score is a recently proposed feature selection method that uses pairwise constraints, which specify whether a pair of instances belongs to the same class. It has been shown that Constraint Score, with only a small number of pairwise constraints, achieves performance comparable to fully supervised feature selection methods such as Fisher Score. However, a major disadvantage of Constraint Score is that its performance depends on a good choice of both the composition and the cardinality of the constraint set, which is difficult to make in practice. In this work, we address this problem by introducing Bagging into Constraint Score, yielding a new method called Bagging Constraint Score (BCS). Instead of seeking one appropriate constraint set for a single Constraint Score, BCS performs multiple Constraint Score evaluations, each using a bootstrap sample of the original constraint set. A diversity analysis of the ensemble members shows that resampling pairwise constraints improves both the accuracy and the diversity of the individual scores. We conduct extensive experiments on a series of high-dimensional datasets from the UCI repository and gene databases, and the experimental results validate the effectiveness of the proposed method.
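The BCS procedure described above can be sketched as follows. This is an illustrative implementation only: it assumes the ratio-style Constraint Score (squared feature differences over must-link pairs divided by those over cannot-link pairs, lower being better) and uses rank averaging as the aggregation rule; the abstract does not fully specify either choice, so the function names and the aggregation are assumptions.

```python
import numpy as np

def constraint_score(X, must_link, cannot_link, eps=1e-12):
    """Per-feature score: must-link scatter over cannot-link scatter.

    Lower scores indicate features that keep same-class pairs close
    and different-class pairs apart. `eps` guards against division by zero.
    """
    ml = np.array([(X[i] - X[j]) ** 2 for i, j in must_link]).sum(axis=0)
    cl = np.array([(X[i] - X[j]) ** 2 for i, j in cannot_link]).sum(axis=0)
    return ml / (cl + eps)

def bagging_constraint_score(X, must_link, cannot_link, n_bags=20, rng=None):
    """Bagging Constraint Score sketch: average feature ranks over
    bootstrap samples of the given constraint sets (lower rank = better)."""
    rng = np.random.default_rng(rng)
    ranks = np.zeros(X.shape[1])
    for _ in range(n_bags):
        # Bootstrap-sample each constraint set with replacement.
        ml = [must_link[k] for k in rng.integers(len(must_link), size=len(must_link))]
        cl = [cannot_link[k] for k in rng.integers(len(cannot_link), size=len(cannot_link))]
        scores = constraint_score(X, ml, cl)
        ranks += scores.argsort().argsort()  # rank of each feature in this bag
    return ranks / n_bags
```

For example, on a toy dataset where feature 0 separates two classes and feature 1 is noise, `bagging_constraint_score` assigns feature 0 a lower (better) average rank across bags. Rank averaging is one simple consensus rule; score averaging or voting over top-k selections would fit the same ensemble framework.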