Feature selection is an important preprocessing step in mining high-dimensional data. Generally, supervised feature selection methods, which exploit supervision information, outperform unsupervised ones, which do not. In the literature, nearly all existing supervised feature selection methods use class labels as the supervision information. In this paper, we propose using another form of supervision information for feature selection: pairwise constraints, which specify whether a pair of data samples belongs to the same class (must-link constraints) or to different classes (cannot-link constraints). Pairwise constraints arise naturally in many tasks and are more practical and less expensive to obtain than class labels. This topic has not yet been addressed in feature selection research. We call our pairwise-constraint-guided feature selection algorithm Constraint Score and compare it with the well-known Fisher Score and Laplacian Score algorithms. Experiments are carried out on several high-dimensional UCI and face data sets. The results show that, with very few pairwise constraints, Constraint Score achieves performance similar to or even higher than that of Fisher Score using full class labels on the whole training data, and significantly outperforms Laplacian Score.
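The constraint-guided scoring idea described above can be sketched in a few lines. The ratio form below — squared feature differences over must-link pairs divided by those over cannot-link pairs, with lower scores preferred — is one plausible instantiation under that description, not necessarily the exact formula from the paper; the function name `constraint_score` is illustrative.

```python
import numpy as np

def constraint_score(X, must_link, cannot_link):
    """Score each feature (column of X) using pairwise constraints.

    A feature is considered good if samples that must belong to the
    same class take similar values on it (small must-link differences)
    while samples from different classes take dissimilar values
    (large cannot-link differences). Lower score = better feature.
    """
    scores = []
    for j in range(X.shape[1]):
        ml = sum((X[i, j] - X[k, j]) ** 2 for i, k in must_link)
        cl = sum((X[i, j] - X[k, j]) ** 2 for i, k in cannot_link)
        scores.append(ml / cl if cl > 0 else np.inf)
    return np.array(scores)

# Toy example: feature 0 separates the two classes, feature 1 is noise.
X = np.array([[0.0, 5.0],
              [0.1, 5.2],
              [1.0, 4.9],
              [1.1, 5.1]])
must_link = [(0, 1), (2, 3)]    # same-class pairs
cannot_link = [(0, 2), (1, 3)]  # different-class pairs
scores = constraint_score(X, must_link, cannot_link)
```

On this toy data the discriminative feature 0 receives a much lower (better) score than the noise feature 1, illustrating how a handful of constraint pairs — rather than full class labels — can already rank features.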