Input selection in nonlinear function approximation is an important and difficult problem. Neural networks generalize well in many cases, but their interpretability is usually limited, even though the contribution of each input variable to the predicted output would be valuable information in many real-world applications. In this work, an input selection algorithm for radial basis function (RBF) networks is proposed. Input variables are selected by minimizing a constrained cost function in which each input dimension is weighted, with constraints imposed on the values of the weights. The proposed algorithm solves a log-barrier reformulation of the original constrained optimization problem. The input selection algorithm was applied to both simulated and benchmark data, and the obtained results were compelling.
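The idea described above can be sketched in code. The snippet below is a minimal illustration, not the paper's actual algorithm: it assumes a Gaussian RBF whose squared distance along each input dimension is scaled by a nonnegative weight, fits the output layer by least squares, and updates the input weights by gradient descent on the training error plus a simple log-barrier term that keeps the weights strictly positive. The function names, the fixed kernel width `sigma`, and the specific barrier and constraint handling are all illustrative assumptions.

```python
import numpy as np

def rbf_design(X, centers, w, sigma=1.0):
    # Design matrix of a Gaussian RBF network where each input
    # dimension j is scaled by the (assumed) relevance weight w[j].
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2 * w).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fit_input_weights(X, y, centers, n_iter=300, lr=0.05, mu=1e-3, sigma=1.0):
    """Illustrative alternating scheme (a simplification of the paper's
    constrained formulation): solve the output-layer coefficients by
    least squares, then take a gradient step on the mean squared error
    in the input weights, with a log-barrier term -mu * sum(log w)
    enforcing w > 0."""
    n, d = X.shape
    w = np.ones(d)
    for _ in range(n_iter):
        Phi = rbf_design(X, centers, w, sigma)
        beta, *_ = np.linalg.lstsq(Phi, y, rcond=None)
        resid = Phi @ beta - y
        # dPhi/dw_j = -Phi * (x_j - c_j)^2 / (2 sigma^2)
        diff2 = (X[:, None, :] - centers[None, :, :]) ** 2
        dPhi_dw = -Phi[:, :, None] * diff2 / (2.0 * sigma ** 2)
        grad = 2.0 * np.einsum('n,nkj,k->j', resid, dPhi_dw, beta) / n
        grad -= mu / w  # gradient of the log-barrier term
        w = np.clip(w - lr * grad, 1e-6, None)
    return w

# Toy data: the output depends only on the first of two inputs, so the
# weight on the irrelevant second input is expected to stay small.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 2))
y = np.sin(3.0 * X[:, 0])
centers = X[rng.choice(200, size=20, replace=False)]
w = fit_input_weights(X, y, centers)
```

In this sketch a small learned weight marks an input dimension as irrelevant; the paper's formulation instead imposes explicit constraints on the weights and solves the resulting problem through a log-barrier reformulation.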