A pattern classification problem usually involves high-dimensional features that make the classifier complex and difficult to train; without feature reduction, both training accuracy and generalization capability suffer. This paper proposes a novel hybrid filter-wrapper feature subset selection method based on a localized generalization error model. For a radial basis function neural network, this model bounds from above the generalization error on unseen samples located within a neighborhood of the training samples. The method iteratively removes the feature that contributes least to the generalization error bound; it is independent of the sample size and computationally fast. Experimental results show that the proposed method consistently removes a large percentage of features with statistically insignificant loss of testing accuracy on unseen samples. For two of the datasets, classifiers built from feature subsets with 90% of the features removed by the proposed approach yield higher average testing accuracies than classifiers trained on the full feature set. Finally, the model's efficacy is corroborated by using it to predict corporate bankruptcies in the US.
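The elimination loop described above can be sketched as follows. This is a minimal illustration, not the paper's method: the RBF-specific localized generalization error bound is not reproduced here, so a leave-one-out 1-nearest-neighbour error rate serves as a hypothetical stand-in for the bound, and the `error_estimate` and `backward_eliminate` names are our own.

```python
def error_estimate(X, y, features):
    """Hypothetical stand-in for the localized generalization error bound:
    leave-one-out error of a 1-nearest-neighbour rule restricted to the
    selected feature indices. The paper's actual bound is RBF-specific."""
    errors = 0
    for i in range(len(X)):
        best, best_d = None, float("inf")
        for j in range(len(X)):
            if i == j:
                continue
            d = sum((X[i][f] - X[j][f]) ** 2 for f in features)
            if d < best_d:
                best_d, best = d, j
        if y[best] != y[i]:
            errors += 1
    return errors / len(X)


def backward_eliminate(X, y, keep=2):
    """Iteratively drop the feature whose removal increases the error
    estimate the least, mirroring the elimination loop in the abstract."""
    features = list(range(len(X[0])))
    while len(features) > keep:
        # Score each candidate by the error after removing it alone.
        scored = [(error_estimate(X, y, [f for f in features if f != g]), g)
                  for g in features]
        _, worst_feature = min(scored)
        features.remove(worst_feature)
    return features
```

On a toy dataset where the class is determined by one feature and a second feature is noise, `backward_eliminate(X, y, keep=1)` retains the informative feature, since dropping the noise feature barely changes the error estimate.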