Input selection is advantageous in regression problems. It may, for example, decrease the training time of models, reduce measurement costs, and help circumvent problems of high dimensionality. Moreover, including useless inputs in a model increases the likelihood of overfitting. Neural networks generalize well in many cases, but their interpretability is usually limited. Nevertheless, selecting a subset of variables and estimating their relative importance would be valuable in many real-world applications. In the present work, a method for simultaneous input and basis function selection in a radial basis function (RBF) network is proposed. The selection is performed by minimizing a constrained optimization problem, in which the sparsity of the network is controlled by two continuous-valued shrinkage parameters. Each input dimension is weighted, and the constraints are imposed on these weights and on the output-layer coefficients. Direct and alternating optimization (AO) procedures are presented to solve the problem. The proposed method is applied to simulated and benchmark data. In comparisons with existing methods, the resulting RBF networks achieve similar prediction accuracy with smaller numbers of inputs and basis functions.
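The structure described in the abstract can be illustrated concretely: Gaussian basis functions whose response is modulated per input dimension by a weight vector, shrinkage applied to both the input weights and the output-layer coefficients, and an alternating optimization loop over the two blocks. The sketch below is a minimal illustration of that structure, not the paper's exact algorithm: it assumes Gaussian kernels, replaces the explicit constraints with a penalized (L1) reformulation solved by proximal gradient steps (ISTA), and all function names, step sizes, and shrinkage values are illustrative choices.

```python
import numpy as np

def design_matrix(X, centers, w):
    """Gaussian responses: Phi[i, j] = exp(-sum_k w_k^2 * (X[i,k] - c[j,k])^2)."""
    diff = X[:, None, :] - centers[None, :, :]               # shape (n, m, d)
    return np.exp(-np.einsum('nmd,d->nm', diff ** 2, w ** 2))

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1; drives small entries exactly to zero."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fit_rbf_ao(X, y, centers, lam_a=0.05, lam_w=0.05,
               n_outer=30, n_inner=200, lr=1e-2, seed=0):
    """Alternating optimization over output coefficients a and input weights w."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.ones(d)                        # one weight per input dimension
    a = rng.normal(scale=0.1, size=centers.shape[0])  # output-layer coefficients
    for _ in range(n_outer):
        # a-step: with w fixed, the problem in a is a lasso; solve by ISTA.
        Phi = design_matrix(X, centers, w)
        L = np.linalg.norm(Phi, 2) ** 2 / n + 1e-12   # Lipschitz constant of the gradient
        for _ in range(n_inner):
            grad_a = Phi.T @ (Phi @ a - y) / n
            a = soft_threshold(a - grad_a / L, lam_a / L)
        # w-step: with a fixed, take proximal gradient steps on the input weights.
        diff = X[:, None, :] - centers[None, :, :]
        for _ in range(n_inner):
            Phi = np.exp(-np.einsum('nmd,d->nm', diff ** 2, w ** 2))
            r = Phi @ a - y
            # d/dw_k of ||r||^2 / (2n); note the factor w, so a pruned
            # input (w_k = 0) stays pruned in later iterations.
            grad_w = -2.0 * w * np.einsum('n,m,nm,nmd->d', r, a, Phi, diff ** 2) / n
            w = soft_threshold(w - lr * grad_w, lr * lam_w)
    return w, a

# Toy check: y depends only on the first of five inputs, so the fitted input
# weights should concentrate on dimension 0 and many output coefficients
# should be shrunk exactly to zero.
rng = np.random.default_rng(1)
X = rng.uniform(-1.0, 1.0, size=(200, 5))
y = np.sin(3.0 * X[:, 0]) + 0.05 * rng.normal(size=200)
centers = X[rng.choice(200, size=20, replace=False)]
w, a = fit_rbf_ao(X, y, centers)
print("input weights:", np.round(w, 3))
print("active basis functions:", int(np.sum(np.abs(a) > 1e-8)))
```

The two shrinkage amounts lam_a and lam_w play the role of the abstract's two continuous-valued sparsity controls: the first prunes basis functions through the output-layer coefficients, the second prunes inputs through the per-dimension weights.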