A traditional interpolation model is characterized by the choice of regularizer applied to the interpolant and the choice of noise model. Typically, the regularizer has a single regularization constant α, and the noise model has a single parameter β. The ratio α/β alone then determines, globally, all of the following attributes of the interpolant: its 'complexity', 'flexibility', 'smoothness', 'characteristic scale length', and 'characteristic amplitude'. We suggest that interpolation models should be able to capture more than just one flavour of simplicity and complexity. We describe Bayesian models in which the interpolant has a smoothness that varies spatially. We emphasize the importance, in practical implementation, of the concept of 'conditional convexity' when designing models with many hyperparameters. We apply the new models to the interpolation of neuronal spike data and demonstrate a substantial improvement in generalization error.
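As a concrete illustration of the point about α/β, here is a minimal sketch (an illustration, not the paper's implementation; the uniform grid, the second-difference roughness operator, and the function name `fit` are all assumptions) of a quadratically regularized interpolant minimizing β‖Aw − y‖² + wᵀDᵀdiag(α)Dw. With a scalar α, rescaling α and β together leaves the interpolant unchanged, since only the ratio enters the normal equations; replacing the scalar with a spatially varying vector of hyperparameters gives a smoothness that differs from region to region, in the spirit of the models described above.

```python
# Sketch only: a discretized 1-D interpolation model with a quadratic
# roughness regularizer. All names and the discretization are illustrative
# assumptions, not the paper's code.
import numpy as np

def fit(alpha, beta, x_data, y_data, n_grid=200):
    """MAP interpolant on a uniform grid, minimizing
    beta * ||A w - y||^2 + w^T D^T diag(alpha) D w,
    where D is the second-difference (roughness) operator and alpha may be
    a scalar or a vector of length n_grid - 2 (spatially varying)."""
    grid = np.linspace(0.0, 1.0, n_grid)
    # A maps grid values to the nearest-grid-point prediction at each datum.
    A = np.zeros((len(x_data), n_grid))
    idx = np.searchsorted(grid, x_data).clip(0, n_grid - 1)
    A[np.arange(len(x_data)), idx] = 1.0
    # Second-difference operator D, shape (n_grid - 2, n_grid).
    D = np.diff(np.eye(n_grid), n=2, axis=0)
    alpha = np.broadcast_to(np.asarray(alpha, dtype=float), (D.shape[0],))
    # Normal equations: (beta A^T A + D^T diag(alpha) D) w = beta A^T y.
    H = beta * A.T @ A + D.T @ (alpha[:, None] * D)
    return grid, np.linalg.solve(H, beta * A.T @ y_data)

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0, 1, 40))
y = np.sin(8 * np.pi * x**2) + 0.1 * rng.normal(size=x.size)

# Only the ratio alpha/beta matters: scaling both by 10 gives the same fit.
_, w1 = fit(alpha=1e-2, beta=1.0, x_data=x, y_data=y)
_, w2 = fit(alpha=1e-1, beta=10.0, x_data=x, y_data=y)
print(np.allclose(w1, w2))  # True

# Spatially varying smoothness: stiff on the left half, flexible on the
# right (alpha has length n_grid - 2 = 198 here).
alpha_var = np.where(np.linspace(0, 1, 198) < 0.5, 1.0, 1e-4)
grid, w3 = fit(alpha=alpha_var, beta=1.0, x_data=x, y_data=y)
```

The direct solve is O(n³) in the grid size and is used only to keep the sketch short; the point is that the hyperparameters enter the normal equations linearly, so a single scalar α and β appear only through their ratio, whereas a vector α lets the effective stiffness, and hence the local smoothness, vary across the domain.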