Matrix computations (3rd ed.)
NETLAB: algorithms for pattern recognition
Pattern Recognition and Machine Learning (Information Science and Statistics)
Pruning RBF networks with QLP decomposition
NN'07: Proceedings of the 8th WSEAS International Conference on Neural Networks - Volume 8
Comparison of Wiener filter solution by SVD with decompositions QR and QLP
AIKED'07: Proceedings of the 6th WSEAS International Conference on Artificial Intelligence, Knowledge Engineering and Data Bases - Volume 6
An efficient sequential learning algorithm for growing and pruning RBF (GAP-RBF) networks
IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
Radial Basis Function (RBF) networks with linear output are often used in regression problems because they can be substantially faster to train than multi-layer perceptrons. We show how radial basis functions of Cauchy, multiquadric and inverse multiquadric type can be used to approximate a rapidly changing continuous test function. In this paper, the performance of the design matrix reduced by QLP decomposition is compared with model selection criteria such as the Schwarz Bayesian Information Criterion (BIC). We introduce the concept of linear basis function models and of the design matrix reduced by QLP decomposition, followed by an application of the QLP methodology to prune networks with different choices of radial basis function. The QLP method proves effective for reducing network size by pruning hidden nodes, resulting in a parsimonious model with accurate prediction of a sinusoidal test function.
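A minimal sketch of the pruning idea described above, assuming Stewart's pivoted QLP decomposition (two pivoted QR factorizations, here via SciPy) applied to a Cauchy-basis design matrix. The sinusoidal test function, basis width, and rank tolerance are illustrative choices, not the paper's actual settings:

```python
import numpy as np
from scipy.linalg import qr

# Sinusoidal test data (illustrative stand-in for the paper's test function)
x = np.linspace(0.0, 1.0, 100)
y = np.sin(2.0 * np.pi * x)

# Cauchy RBF design matrix, one hidden node centred at each training point
width = 0.1
r2 = (x[:, None] - x[None, :]) ** 2
Phi = 1.0 / (1.0 + r2 / width**2)

# Pivoted QLP: a pivoted QR of Phi, then a pivoted QR of R^T.
# The magnitudes on the diagonal of L track the singular values of Phi.
Q0, R0, p0 = qr(Phi, pivoting=True)
Q1, R1, p1 = qr(R0.T, pivoting=True)
L = R1.T
d = np.abs(np.diag(L))

# Numerical rank: keep hidden nodes whose L-diagonal entry is significant
rank = int(np.sum(d / d[0] > 1e-8))

# Prune: retain the leading pivoted columns and refit the output weights
cols = p0[:rank]
w, *_ = np.linalg.lstsq(Phi[:, cols], y, rcond=None)
err = np.max(np.abs(Phi[:, cols] @ w - y))
```

The diagonal of L from the second pivoted QR serves as a cheap surrogate for the singular values, which is what makes QLP attractive for pruning: the rank decision comes from two QR factorizations instead of a full SVD.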