Feature selection plays an important role in pattern classification. Its purpose is to remove as many redundant features as possible from a data set. Useless features may not only degrade the performance of learning algorithms but also obscure important information behind the data, such as its intrinsic structure. With new and emerging data-acquisition techniques, data sets in many domains are growing ever larger, and irrelevant features are often prevalent in them. This poses great challenges to traditional learning algorithms, including low efficiency and over-fitting, so an efficient technique for eliminating redundant or irrelevant features is clearly needed. Many attempts have been made to cope with this problem, and a variety of effective feature selection methods have been proposed. Unlike other selection methods, the method proposed in this paper is a general boosting-based feature selection scheme that uses an information metric. Its primary characteristic is that it exploits sample weights to select salient features, and these weights are dynamically updated after each candidate feature is selected. The information criterion used by the feature selector can therefore accurately represent the degree of relevance between features and the class labels, so the selected feature subset has maximal relevance to the class labels. Simulation studies on UCI data sets show that the classification performance achieved by the proposed method is better than that of other selection methods in most cases.
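The core idea, a relevance criterion estimated from sample weights that are boosted after each feature is chosen, can be illustrated with a minimal sketch. This is not the authors' implementation: the function names, the majority-vote "weak learner" used for reweighting, and the AdaBoost-style update rule are illustrative assumptions; the paper's actual information metric and weight update may differ.

```python
import math
from collections import defaultdict

def weighted_mutual_information(feature, labels, weights):
    """I(X; Y) with probabilities estimated from normalized sample weights
    rather than raw counts, so reweighted samples shift the estimate."""
    total = sum(weights)
    p_xy, p_x, p_y = defaultdict(float), defaultdict(float), defaultdict(float)
    for x, y, w in zip(feature, labels, weights):
        w /= total
        p_xy[(x, y)] += w
        p_x[x] += w
        p_y[y] += w
    return sum(w * math.log2(w / (p_x[x] * p_y[y]))
               for (x, y), w in p_xy.items() if w > 0)

def boosted_feature_selection(X, y, k):
    """Greedy selection sketch: repeatedly pick the feature with the highest
    weighted MI, then up-weight the samples that the chosen feature alone
    would misclassify (weighted majority class per feature value)."""
    n = len(y)
    weights = [1.0 / n] * n
    remaining = list(range(len(X[0])))
    selected = []
    for _ in range(min(k, len(remaining))):
        col = lambda j: [row[j] for row in X]
        best = max(remaining,
                   key=lambda j: weighted_mutual_information(col(j), y, weights))
        selected.append(best)
        remaining.remove(best)
        # Predict each sample by the weighted majority label of its value
        # on the selected feature, then boost the weights of the errors.
        maj = defaultdict(lambda: defaultdict(float))
        for x, lbl, w in zip(col(best), y, weights):
            maj[x][lbl] += w
        pred = [max(maj[x], key=maj[x].get) for x in col(best)]
        err = sum(w for p, lbl, w in zip(pred, y, weights) if p != lbl)
        err = min(max(err, 1e-10), 1 - 1e-10)  # clip to keep alpha finite
        alpha = 0.5 * math.log((1 - err) / err)
        weights = [w * math.exp(alpha if p != lbl else -alpha)
                   for w, p, lbl in zip(weights, pred, y)]
        s = sum(weights)
        weights = [w / s for w in weights]
    return selected
```

On a toy data set where the first feature determines the label and the second is noise, the selector picks the first feature, since its weighted mutual information with the labels is maximal.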