This paper presents a new feature selection (FS) algorithm based on the wrapper approach using neural networks (NNs). A vital aspect of this algorithm is the automatic determination of NN architectures during the FS process. Our algorithm uses a constructive approach involving correlation information to select features and to determine NN architectures. We call this algorithm the constructive approach for FS (CAFS). The aim of using correlation information in CAFS is to encourage the search strategy to select less correlated (distinct) features when they enhance the accuracy of NNs. Such encouragement reduces information redundancy, resulting in compact NN architectures. We evaluate the performance of CAFS on eight benchmark classification problems. The experimental results demonstrate the effectiveness of CAFS in selecting features while producing compact NN architectures.
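The idea of a wrapper search that favors less correlated features can be illustrated with a minimal sketch. This is not the authors' CAFS algorithm (which grows the NN constructively and scores subsets by NN accuracy): here a greedy forward search uses a caller-supplied `evaluate` function as a stand-in for the wrapper's accuracy estimate, and subtracts a penalty proportional to a candidate feature's mean absolute correlation with the features already selected. The names `greedy_correlated_fs`, `corr_score`, and the weight `alpha` are illustrative assumptions, not from the paper.

```python
import numpy as np

def correlation_penalty(X, selected, candidate):
    """Mean absolute Pearson correlation between the candidate feature
    and the already-selected features (0.0 if none are selected yet)."""
    if not selected:
        return 0.0
    corrs = [abs(np.corrcoef(X[:, candidate], X[:, j])[0, 1]) for j in selected]
    return float(np.mean(corrs))

def greedy_correlated_fs(X, y, evaluate, k, alpha=0.5):
    """Forward selection: at each step, add the feature whose wrapper
    score minus a correlation penalty (weighted by alpha) is highest.
    `evaluate(X_subset, y)` is a stand-in for the wrapper criterion;
    in a CAFS-like setting it would be an NN's validation accuracy."""
    selected = []
    for _ in range(k):
        best, best_score = None, -np.inf
        for j in range(X.shape[1]):
            if j in selected:
                continue
            score = (evaluate(X[:, selected + [j]], y)
                     - alpha * correlation_penalty(X, selected, j))
            if score > best_score:
                best, best_score = j, score
        selected.append(best)
    return selected

def corr_score(Xs, y):
    """Toy wrapper criterion (assumption): mean absolute correlation of
    the chosen columns with the target, as a cheap accuracy proxy."""
    return float(np.mean([abs(np.corrcoef(Xs[:, j], y)[0, 1])
                          for j in range(Xs.shape[1])]))
```

With a redundant feature pair (columns 0 and 1 identical), the penalty steers the second pick toward a distinct, weaker feature instead of the duplicate, which is the redundancy-reducing behavior the abstract describes.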