This paper presents a novel feature-selection algorithm for data regression with many irrelevant features. The proposed method is based on a well-established machine-learning technique and makes no assumption about the underlying data distribution. The key idea is to decompose an arbitrarily complex nonlinear problem into a set of locally linear ones using local information, and to learn feature relevance globally within the least-squares loss framework. In contrast to other feature-selection algorithms for data regression, learning is efficient because the solution can be found through gradient descent with a simple update rule. Experiments on synthetic and real-world data sets demonstrate the viability of our formulation of the feature-selection problem and the effectiveness of our algorithm.
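The general idea can be sketched as follows. This is a minimal illustration, not the paper's algorithm: it learns nonnegative feature weights for regression by minimizing a leave-one-out least-squares loss of a locally weighted (kernel) predictor via gradient descent, with an L1 penalty to shrink irrelevant features. The function names, the numerical gradient, and all hyperparameters (`lr`, `steps`, `lam`) are assumptions for this sketch; the paper derives its own analytic update rule.

```python
import numpy as np

def local_regression_loss(w, X, y):
    # Leave-one-out locally weighted prediction with feature weights w:
    # each sample is predicted from its neighbors, where "nearness" is
    # measured in the w-weighted feature space.
    n, d = X.shape
    d2 = np.zeros((n, n))
    for k in range(d):
        diff = X[:, k][:, None] - X[:, k][None, :]
        d2 += w[k] * diff ** 2
    K = np.exp(-d2)
    np.fill_diagonal(K, 0.0)                       # exclude the point itself
    y_hat = K @ y / (K.sum(axis=1) + 1e-12)
    return np.mean((y - y_hat) ** 2)

def select_features(X, y, lr=0.2, steps=150, lam=0.01):
    # Projected gradient descent on nonnegative feature weights.
    # A numerical gradient is used here for brevity (an assumption of
    # this sketch); irrelevant features should be driven toward zero.
    w = np.ones(X.shape[1])
    eps = 1e-4
    for _ in range(steps):
        base = local_regression_loss(w, X, y) + lam * w.sum()
        g = np.zeros_like(w)
        for k in range(len(w)):
            wp = w.copy()
            wp[k] += eps
            g[k] = (local_regression_loss(wp, X, y) + lam * wp.sum() - base) / eps
        w = np.maximum(w - lr * g, 0.0)            # projection keeps w >= 0
    return w

# Toy data: the target depends on feature 0 only; features 1-2 are noise.
rng = np.random.default_rng(1)
X = rng.normal(size=(80, 3))
y = np.sin(2 * X[:, 0])
w = select_features(X, y)
print(w)  # the relevant feature should receive the largest weight
```

On synthetic data of this kind, the weight on the relevant feature should dominate the weights on the irrelevant ones, mirroring the paper's claim that relevance can be learned efficiently through simple gradient updates.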