Feature selection is an important preprocessing step in machine learning and pattern recognition, and it also arises as a data mining task in some real-world applications. Feature quality evaluation is a key issue in designing a feature selection algorithm, and in recent years the classification margin has been widely used for this purpose. In this study, we introduce a robust loss function, called BrownBoost loss, to compute feature quality and to select optimal feature subsets with enhanced robustness. We compute the classification loss in a feature space in terms of the hypothesis margin and minimize the loss by optimizing the feature weights. An algorithm is developed based on gradient descent with L2-norm regularization. The proposed algorithm is tested on UCI datasets and on gene expression datasets. The experimental results show that the proposed algorithm is effective in improving classification robustness.
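A minimal sketch of the idea the abstract describes, assuming a Relief/Simba-style weighted hypothesis margin (distance to the nearest miss minus distance to the nearest hit) and a BrownBoost-style bounded loss of the form 1 - erf(margin / c). The function names, the constant c, the nonnegativity projection, and the numerical gradient are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np
from scipy.special import erf


def hypothesis_margin(X, y, w):
    """Weighted hypothesis margin of each sample: distance to its
    nearest miss minus distance to its nearest hit, with the
    per-feature weights w entering the distance (Simba style)."""
    d = np.sqrt((((X[:, None, :] - X[None, :, :]) * w) ** 2).sum(-1))
    np.fill_diagonal(d, np.inf)  # exclude each sample from its own neighbors
    margins = np.empty(len(y))
    for i in range(len(y)):
        hit = (y == y[i])
        margins[i] = d[i, ~hit].min() - d[i, hit].min()
    return margins


def brownboost_loss(margin, c=1.0):
    """BrownBoost-style robust loss: bounded above, so samples with
    large negative margins (likely outliers) contribute at most a
    constant penalty instead of growing without bound."""
    return 1.0 - erf(margin / c)


def feature_weights(X, y, lam=0.1, lr=0.05, n_iter=200, c=1.0, eps=1e-4):
    """Gradient descent on the L2-regularized empirical loss
    sum_i loss(theta_w(x_i)) + lam * ||w||_2^2, returning normalized
    feature weights; small weights indicate removable features."""
    def objective(w):
        return (brownboost_loss(hypothesis_margin(X, y, w), c).sum()
                + lam * (w ** 2).sum())

    w = np.ones(X.shape[1])
    for _ in range(n_iter):
        # Finite-difference gradient keeps the sketch short; an
        # analytic gradient would be used in practice.
        base = objective(w)
        grad = np.zeros_like(w)
        for j in range(len(w)):
            w_step = w.copy()
            w_step[j] += eps
            grad[j] = (objective(w_step) - base) / eps
        w = np.maximum(w - lr * grad, 0.0)  # keep weights nonnegative
    return w / (np.abs(w).max() + 1e-12)
```

Under these assumptions, selection amounts to ranking features by the learned weights and keeping those above a threshold; the bounded loss is what distinguishes this from hinge- or exponential-loss margin weighting when labels are noisy.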