Information Processing Letters
The notion of relevance is used in many technical fields. In machine learning and data mining, for example, relevance frequently serves as a measure in feature subset selection (FSS). In previous studies, the interpretation of relevance has varied and its connection to FSS has been loose. In this paper, a rigorous mathematical formalism of relevance is proposed that is quantitative and normalized. To apply the formalism to FSS, a characterization of FSS is proposed: preserve the learning information while minimizing the joint entropy of the selected features. Based on this characterization, a tight connection between relevance and FSS is established: FSS amounts to maximizing both the relevance of the features to the decision attribute and the relevance of the decision attribute to the features. This connection is then used to design an FSS algorithm that is linear in the number of instances and quadratic in the number of features. Evaluated on 23 public datasets, the algorithm improves prediction accuracy on 16 datasets and loses accuracy on only 1, providing evidence that both the formalism and its connection to FSS are sound.
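The abstract does not reproduce the algorithm itself. As an illustration of the information-theoretic ingredients it names (relevance of features to the decision attribute, joint entropy), the following is a minimal sketch of a greedy, mutual-information-based forward selection over discrete features. The helper names (`entropy`, `greedy_fss`), the toy data, and the stopping rule are assumptions made for illustration, not the paper's actual procedure.

```python
# Sketch only: a generic entropy-based forward selection, NOT the
# paper's algorithm. Assumes discrete-valued features and decision.
from collections import Counter
from math import log2

def entropy(column):
    """Shannon entropy of a discrete sequence, in bits."""
    n = len(column)
    return -sum((c / n) * log2(c / n) for c in Counter(column).values())

def joint(*columns):
    """Zip parallel columns into joint-value tuples."""
    return list(zip(*columns))

def mutual_information(x, y):
    """I(X;Y) = H(X) + H(Y) - H(X,Y)."""
    return entropy(x) + entropy(y) - entropy(joint(x, y))

def greedy_fss(features, decision):
    """Forward selection: repeatedly add the feature that most increases
    the mutual information between the selected subset and the decision
    attribute; stop when no candidate improves it (a hypothetical
    stopping rule, standing in for the paper's characterization)."""
    selected, best_mi = [], 0.0
    remaining = list(range(len(features)))
    while remaining:
        gains = []
        for i in remaining:
            cols = [features[j] for j in selected] + [features[i]]
            gains.append((mutual_information(joint(*cols), decision), i))
        mi, i = max(gains)
        if mi <= best_mi + 1e-12:   # no candidate adds information
            break
        best_mi = mi
        selected.append(i)
        remaining.remove(i)
    return selected

# Toy data: feature 0 determines the class; feature 1 is noise.
f0 = [0, 0, 1, 1, 0, 1, 0, 1]
f1 = [0, 1, 0, 1, 1, 0, 1, 0]
y  = [0, 0, 1, 1, 0, 1, 0, 1]
print(greedy_fss([f0, f1], y))  # -> [0]
```

The greedy search here keeps the overall cost quadratic in the number of features and linear in the number of instances, matching the complexity the abstract reports, though the actual criterion the paper optimizes is its normalized relevance measure rather than raw mutual information.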