In pattern recognition problems it has been observed that, beyond a certain point, including additional features whose parameters must be estimated from data leads to a higher probability of error. A simple problem is formulated in which the probability of error approaches zero as the dimensionality increases when all parameters are known; on the other hand, the probability of error approaches one-half as the dimensionality increases when the parameters must be estimated.
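The contrast between the known-parameter and estimated-parameter regimes can be illustrated with a small Monte-Carlo sketch. The construction below is an assumption chosen for illustration (two Gaussian classes with identity covariance and opposite means whose components shrink as 1/sqrt(i)); the function name `trunk_errors` and all parameters are hypothetical, not taken from the source.

```python
import numpy as np

def trunk_errors(d, n_train, n_test=4000, seed=0):
    """Monte-Carlo sketch of the dimensionality effect described above.

    Assumed construction: two Gaussian classes with identity covariance
    and means +/-mu, where mu_i = 1/sqrt(i).  With the means known, the
    error keeps shrinking as d grows; with the means estimated from
    n_train samples per class, estimation noise eventually dominates and
    the error climbs back up.
    """
    rng = np.random.default_rng(seed)
    mu = 1.0 / np.sqrt(np.arange(1, d + 1))

    # Estimate the mean-difference direction from training samples.
    x_pos = rng.normal(mu, 1.0, size=(n_train, d))
    x_neg = rng.normal(-mu, 1.0, size=(n_train, d))
    mu_hat = (x_pos.mean(axis=0) - x_neg.mean(axis=0)) / 2.0

    # Test points with labels in {-1, +1}; classify by projection sign.
    y = rng.integers(0, 2, size=n_test) * 2 - 1
    x = rng.normal(y[:, None] * mu, 1.0, size=(n_test, d))
    err_known = np.mean(np.sign(x @ mu) != y)          # parameters known
    err_estimated = np.mean(np.sign(x @ mu_hat) != y)  # parameters estimated
    return err_known, err_estimated

for d in (1, 10, 100, 1000):
    known, est = trunk_errors(d, n_train=10)
    print(f"d={d:5d}  known={known:.3f}  estimated={est:.3f}")
```

Running the sketch, the known-parameter error decreases monotonically with d, while the estimated-parameter error first improves and then degrades as the dimensionality grows past what n_train samples can support.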