Databases in all areas of knowledge have grown markedly in size, both in the number of instances and in the number of attributes. Current data sets may contain hundreds of thousands of variables, many of them redundant and/or irrelevant. This volume of data causes performance and scalability problems for many data mining algorithms. In this work we present the state of the art in embedded feature selection using the Support Vector Machine (SVM) classifier, together with two additional approaches that address new challenges in this area: simultaneous feature and model selection, and highly imbalanced binary classification. We compare our approaches with other state-of-the-art algorithms to demonstrate their effectiveness and efficiency.
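To make the idea of embedded feature selection with an SVM concrete, the sketch below trains a linear SVM with an L1 penalty by plain subgradient descent, then keeps the features whose learned weights are non-negligible. This is a minimal illustration of the general L1-SVM principle, not the specific algorithms surveyed in the paper; the data set, hyperparameters (`lam`, `lr`, `epochs`), and the selection threshold are all illustrative assumptions.

```python
import random

def train_l1_svm(X, y, lam=0.05, lr=0.01, epochs=200):
    """Subgradient descent on the L1-regularized hinge loss:
       min_w  (1/n) * sum_i max(0, 1 - y_i * <w, x_i>) + lam * ||w||_1
    The L1 term drives weights of uninformative features toward zero,
    which is what makes the feature selection "embedded" in training."""
    n, d = len(X), len(X[0])
    w = [0.0] * d
    for _ in range(epochs):
        grad = [0.0] * d
        for xi, yi in zip(X, y):
            margin = yi * sum(wj * xj for wj, xj in zip(w, xi))
            if margin < 1:  # hinge loss is active for this sample
                for j in range(d):
                    grad[j] -= yi * xi[j]
        for j in range(d):
            # data subgradient plus the L1 subgradient lam * sign(w_j)
            g = grad[j] / n + lam * (1.0 if w[j] > 0 else -1.0 if w[j] < 0 else 0.0)
            w[j] -= lr * g
    return w

# Tiny synthetic task: feature 0 carries the label, features 1-2 are pure noise.
random.seed(0)
X, y = [], []
for _ in range(100):
    label = random.choice([-1, 1])
    X.append([label + random.gauss(0, 0.1),
              random.gauss(0, 1),
              random.gauss(0, 1)])
    y.append(label)

w = train_l1_svm(X, y)
# Embedded selection step: keep features with non-negligible weight.
selected = [j for j in range(len(w)) if abs(w[j]) > 0.1]
print("weights:", w)
print("selected features:", selected)
```

On this toy data the L1 penalty typically shrinks the two noise weights toward zero, so the informative feature dominates the selection. Real embedded methods refine this basic recipe, e.g. by approximating the zero-norm or penalizing kernel parameters, as in the works surveyed above.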