Variable selection in high-dimensional space characterizes many contemporary problems in scientific discovery and decision making. Many frequently used techniques are based on independence screening; examples include correlation ranking (Fan & Lv, 2008) and feature selection using a two-sample t-test in high-dimensional classification (Tibshirani et al., 2003). Within the context of the linear model, Fan & Lv (2008) showed that this simple correlation ranking possesses a sure independence screening property under certain conditions, and that its refinement, iterative sure independence screening (ISIS), is needed when features are marginally unrelated but jointly related to the response variable. In this paper, we extend ISIS, without an explicit definition of residuals, to a general pseudo-likelihood framework, which includes generalized linear models as a special case. Even in the least-squares setting, the new method improves on ISIS by allowing feature deletion in the iterative process. Our technique selects important features in high-dimensional classification where the popular two-sample t-test fails. A new technique is also introduced to reduce the false selection rate at the feature screening stage. Several simulated examples and two real data examples illustrate the methodology.
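To make the basic screening step concrete, below is a minimal Python sketch of correlation ranking as described by Fan & Lv (2008): rank features by the absolute marginal correlation with the response and retain the top d. The function name `sis_screen` and the toy data are illustrative assumptions, not the authors' code, and the sketch covers only the marginal screening step, not the iterative (ISIS) extension or the deletion mechanism proposed in the paper.

```python
import numpy as np

def sis_screen(X, y, d):
    """Rank features by absolute marginal correlation with the
    response and keep the indices of the top d (a sketch of the
    sure independence screening step, not the full ISIS loop)."""
    # Standardize columns so correlation reduces to an inner product.
    Xc = (X - X.mean(axis=0)) / X.std(axis=0)
    yc = (y - y.mean()) / y.std()
    omega = np.abs(Xc.T @ yc) / len(y)    # marginal correlations
    return np.argsort(omega)[::-1][:d]    # indices of the d largest

# Toy usage (hypothetical data): n = 100 observations, p = 1000
# features, the first 5 truly related to the response.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 1000))
beta = np.zeros(1000)
beta[:5] = 2.0
y = X @ beta + rng.standard_normal(100)
print(sis_screen(X, y, 20))  # typically contains indices 0..4
```

Note that this marginal ranking is exactly what fails when features are marginally unrelated but jointly related to the response; that failure mode is what motivates the iterative, residual-free extension developed in the paper.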