The proposed feature selection method finds a minimal subset of the most informative variables for classification and regression by efficiently approximating the Markov Blanket, the set of variables that shields a given variable from the target. Instead of relying on conditional independence tests or network structure learning, the method uses the Hilbert-Schmidt Independence Criterion (HSIC) as a measure of dependence among variables in a kernel-induced feature space. This allows effective approximation of a Markov Blanket consisting of multiple dependent features, rather than being limited to a single feature, and it removes both irrelevant and redundant features simultaneously. Because the approach applies to both discrete and continuous variables, it also handles regression problems, to which previous Markov Blanket methods, which cannot be applied directly to continuous features, do not extend. Experimental evaluations on synthetic and benchmark classification and regression datasets provide evidence that the method removes useless variables in both low- and high-dimensional problems more accurately than existing Markov Blanket based alternatives.
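To make the dependence measure concrete, the following is a minimal sketch of the empirical Hilbert-Schmidt Independence Criterion that the abstract refers to. The RBF kernel, the fixed bandwidth, and the biased trace estimator used here are illustrative assumptions, not the paper's exact implementation details.

```python
import numpy as np

def rbf_kernel(x, sigma=1.0):
    # Gaussian (RBF) kernel matrix from pairwise squared distances.
    # Kernel choice and bandwidth are assumptions for illustration.
    x = np.asarray(x, dtype=float).reshape(len(x), -1)
    sq = np.sum(x ** 2, axis=1, keepdims=True)
    d2 = sq + sq.T - 2.0 * (x @ x.T)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic(x, y, sigma=1.0):
    # Biased empirical HSIC estimate: tr(K H L H) / (n - 1)^2,
    # where H centers the kernel matrices in feature space.
    # Larger values indicate stronger dependence between x and y.
    n = len(x)
    K = rbf_kernel(x, sigma)
    L = rbf_kernel(y, sigma)
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2
```

A feature strongly dependent on the target yields a noticeably larger HSIC score than an independent one, which is the signal a Markov Blanket approximation built on HSIC would exploit when ranking candidate features for removal.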