Combining labeled and unlabeled data with co-training
COLT '98 Proceedings of the eleventh annual conference on Computational learning theory
An introduction to variable and feature selection
The Journal of Machine Learning Research
Toward Integrating Feature Selection Algorithms for Classification and Clustering
IEEE Transactions on Knowledge and Data Engineering
Spectral feature selection for supervised and unsupervised learning
Proceedings of the 24th international conference on Machine learning
Discriminative semi-supervised feature selection via manifold regularization
IJCAI'09 Proceedings of the 21st international joint conference on Artificial intelligence
Semi-supervised feature selection for graph classification
Proceedings of the 16th ACM SIGKDD international conference on Knowledge discovery and data mining
Discriminative semi-supervised feature selection via manifold regularization
IEEE Transactions on Neural Networks
Orientation distance-based discriminative feature extraction for multi-class classification
CIKM '10 Proceedings of the 19th ACM international conference on Information and knowledge management
IEA/AIE'10 Proceedings of the 23rd international conference on Industrial engineering and other applications of applied intelligent systems - Volume Part II
Constrained laplacian score for semi-supervised feature selection
ECML PKDD'11 Proceedings of the 2011 European conference on Machine learning and knowledge discovery in databases - Volume Part I
A semi-supervised feature ranking method with ensemble learning
Pattern Recognition Letters
Semi-supervised feature selection using co-occurrent frequent subgraphs
Proceedings of the 7th International Conference on Ubiquitous Information Management and Communication
EEG signal classification using the event-related coherence and genetic algorithm
BICS'13 Proceedings of the 6th international conference on Advances in Brain Inspired Cognitive Systems
Nonlinear dynamic analysis of pathological voices
ICIC'13 Proceedings of the 9th international conference on Intelligent Computing Theories and Technology
Traditionally, feature selection methods work directly on labeled examples. However, the availability of labeled examples cannot be taken for granted in many real-world applications, such as medical diagnosis, forensic science, and fraud detection, where labeled examples are hard to obtain. This practical problem calls for "semi-supervised feature selection": choosing, from both labeled and unlabeled examples, the optimal set of features that yields the most accurate classifier for a learning algorithm. In this paper, we introduce a "wrapper-type" forward semi-supervised feature selection framework. In essence, it uses unlabeled examples to extend the initial labeled training set. Extensive experiments on publicly available datasets show that our proposed framework generally outperforms both traditional supervised and state-of-the-art "filter-type" semi-supervised feature selection algorithms [5] by 1% to 10% in accuracy.
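The idea described in the abstract, a greedy forward wrapper in which unlabeled examples extend the labeled training set, can be sketched as follows. This is a minimal illustrative sketch, not the paper's exact method: the nearest-centroid base learner, the self-training (pseudo-labeling) step, the function names, and the toy data are all assumptions introduced here.

```python
import random

def centroid_classifier(X, y, feats):
    """Nearest-centroid classifier restricted to the selected feature indices."""
    groups = {}
    for xi, yi in zip(X, y):
        groups.setdefault(yi, []).append([xi[f] for f in feats])
    cents = {c: [sum(p[i] for p in pts) / len(pts) for i in range(len(feats))]
             for c, pts in groups.items()}
    def predict(xi):
        v = [xi[f] for f in feats]
        return min(cents, key=lambda c: sum((a - b) ** 2 for a, b in zip(v, cents[c])))
    return predict

def accuracy(predict, X, y):
    return sum(predict(xi) == yi for xi, yi in zip(X, y)) / len(y)

def forward_select_semi(Xl, yl, Xu, Xval, yval, n_feats):
    """Greedy forward wrapper selection; the unlabeled pool extends the
    training set via pseudo-labels before each candidate feature is scored."""
    selected, best_acc = [], 0.0
    while True:
        best_f, best_f_acc = None, best_acc
        for f in range(n_feats):
            if f in selected:
                continue
            feats = selected + [f]
            # 1) train on the labeled examples alone
            clf = centroid_classifier(Xl, yl, feats)
            # 2) pseudo-label the unlabeled pool with that classifier
            pseudo = [clf(xi) for xi in Xu]
            # 3) retrain on the extended (labeled + pseudo-labeled) set
            clf2 = centroid_classifier(Xl + Xu, yl + pseudo, feats)
            acc = accuracy(clf2, Xval, yval)
            if acc > best_f_acc:
                best_f, best_f_acc = f, acc
        if best_f is None:  # no candidate improved held-out accuracy; stop
            break
        selected.append(best_f)
        best_acc = best_f_acc
    return selected

# Toy data (hypothetical): feature 0 separates the classes; 1 and 2 are noise.
random.seed(0)
def point(c):
    return [random.gauss(5.0 * c, 1.0), random.gauss(0, 1), random.gauss(0, 1)]

Xl   = [point(c) for c in (0, 1) for _ in range(5)]
yl   = [c        for c in (0, 1) for _ in range(5)]
Xu   = [point(c) for c in (0, 1) for _ in range(20)]
Xval = [point(c) for c in (0, 1) for _ in range(10)]
yval = [c        for c in (0, 1) for _ in range(10)]

selected = forward_select_semi(Xl, yl, Xu, Xval, yval, n_feats=3)
print(selected)  # the informative feature (index 0) should be chosen
```

Because it is a wrapper, the selection criterion is the held-out accuracy of the retrained classifier itself, which is what distinguishes this family from the "filter-type" methods the abstract compares against.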