We propose a hierarchical full Bayesian model for radial basis networks. This model treats the model dimension (number of neurons), model parameters, regularization parameters, and noise parameters as unknown random variables. We develop a reversible-jump Markov chain Monte Carlo (MCMC) method to perform the Bayesian computation. We find that the results obtained using this method are not only better than those reported previously, but also appear to be robust with respect to the prior specification. In addition, we propose a novel and computationally efficient reversible-jump MCMC simulated annealing algorithm to optimize neural networks. This algorithm enables us to maximize the joint posterior distribution of the network parameters and the number of basis functions. It performs a global search in the joint space of the parameters and the number of parameters, thereby surmounting the problem of local minima to a large extent. We show that by calibrating the full hierarchical Bayesian prior, we can obtain the classical Akaike information criterion, Bayesian information criterion, and minimum description length model selection criteria within a penalized likelihood framework. Finally, we present a geometric convergence theorem for the algorithm with a homogeneous transition kernel and a convergence theorem for the reversible-jump MCMC simulated annealing method.
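To make the idea concrete, the following is a minimal sketch (not the authors' implementation) of a reversible-jump MCMC simulated annealing loop for a radial basis function network: birth, death, and random-walk moves on the set of basis-function centers, scored by a penalized likelihood whose complexity penalty plays the role of the calibrated AIC/BIC-style prior. The toy data, the Gaussian basis width, the penalty weight `lam`, and the cooling schedule are all illustrative assumptions, and the full reversible-jump acceptance ratio (proposal and Jacobian terms) is collapsed into a plain Metropolis step for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data (assumed for illustration; not from the paper).
x = np.linspace(-3.0, 3.0, 60)
y = np.sin(x) + 0.1 * rng.standard_normal(x.size)

def rbf_design(x, centers, width=1.0):
    """Design matrix of Gaussian basis functions plus a bias column."""
    phi = np.exp(-0.5 * ((x[:, None] - centers[None, :]) / width) ** 2)
    return np.column_stack([np.ones_like(x), phi])

def score_fn(centers, lam=1.0):
    """Penalized fit: residual sum of squares plus lam * k, where the
    penalty on the number of basis functions stands in for the
    calibrated hierarchical prior (AIC/BIC-style model selection)."""
    phi = rbf_design(x, centers)
    w, *_ = np.linalg.lstsq(phi, y, rcond=None)
    resid = y - phi @ w
    return 0.5 * resid @ resid + lam * centers.size

def rj_mcmc_sa(n_iter=3000, t0=1.0, t_final=0.01, k_max=20):
    """Reversible-jump MCMC simulated annealing sketch: birth, death,
    and move proposals on the RBF centers under a cooling temperature."""
    centers = rng.uniform(-3.0, 3.0, size=1)
    score = score_fn(centers)
    best_c, best_s = centers.copy(), score
    for i in range(n_iter):
        t = t0 * (t_final / t0) ** (i / n_iter)  # geometric cooling
        u = rng.random()
        if u < 1 / 3 and centers.size > 1:        # death: remove a center
            prop = np.delete(centers, rng.integers(centers.size))
        elif u < 2 / 3 and centers.size < k_max:  # birth: add a center
            prop = np.append(centers, rng.uniform(-3.0, 3.0))
        else:                                     # move: perturb a center
            prop = centers.copy()
            prop[rng.integers(prop.size)] += 0.3 * rng.standard_normal()
        s = score_fn(prop)
        # Metropolis acceptance at temperature t (the reversible-jump
        # proposal-ratio and Jacobian terms are omitted in this sketch).
        if np.log(rng.random() + 1e-300) < (score - s) / t:
            centers, score = prop, s
            if score < best_s:
                best_c, best_s = centers.copy(), score
    return best_c, best_s

centers, score = rj_mcmc_sa()
print(centers.size, round(score, 3))
```

As the temperature falls, the chain concentrates on high-posterior configurations jointly over the centers and their number, which is the sense in which the annealed sampler performs a global search rather than a local gradient descent.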