A locally regularized orthogonal least squares (LROLS) algorithm is proposed for constructing parsimonious or sparse regression models that generalize well. By associating each orthogonal weight in the regression model with an individual regularization parameter, the ability of orthogonal least squares model selection to produce a very sparse model with good generalization performance is greatly enhanced. Furthermore, local regularization makes it much clearer when to terminate the model selection procedure. A comparison with a state-of-the-art method for constructing sparse regression models, the relevance vector machine, is given. The proposed LROLS algorithm is shown to possess considerable computational advantages, including a well-conditioned solution and faster convergence.
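The greedy forward-selection core of a regularized OLS procedure can be sketched as follows. This is a minimal illustration under simplifying assumptions, not the authors' implementation: a single fixed regularization value `lam` stands in for the individually adapted parameters (which LROLS re-estimates per orthogonal weight via Bayesian evidence updates), and the function name and interface are hypothetical.

```python
import numpy as np

def lrols_select(P, y, lam=1e-3, n_terms=5):
    """Greedy forward selection of regressors by regularized error
    reduction ratio, with modified Gram-Schmidt orthogonalization.

    Simplification: `lam` is one fixed value; the full LROLS algorithm
    adapts an individual lambda for each orthogonal weight.
    """
    P = P.astype(float).copy()   # candidate regressor matrix (n x m)
    y = y.astype(float)
    n, m = P.shape
    yty = y @ y
    selected, g = [], []         # chosen column indices, orthogonal weights
    remaining = list(range(m))

    for _ in range(min(n_terms, m)):
        best, best_rerr, best_w, best_g = None, -np.inf, None, None
        for j in remaining:
            w = P[:, j]
            d = w @ w + lam
            gj = (w @ y) / d                 # regularized orthogonal weight
            rerr = gj * gj * d / yty         # regularized error reduction ratio
            if rerr > best_rerr:
                best, best_rerr, best_w, best_g = j, rerr, w.copy(), gj
        selected.append(best)
        remaining.remove(best)
        g.append(best_g)
        # Orthogonalize the remaining candidates against the chosen column,
        # so each later pick is orthogonal to all earlier ones.
        denom = best_w @ best_w
        if denom > 0:
            for j in remaining:
                P[:, j] -= ((best_w @ P[:, j]) / denom) * best_w

    return selected, np.array(g)
```

With a small `lam`, a column that (up to scale) generates the target is picked first and its orthogonal weight recovers the scale; terminating when the accumulated error reduction ratio approaches one is the usual stopping rule, which local regularization makes sharper.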