This paper clarifies the learning efficiency of non-regular parametric models, such as neural networks, whose true parameter set is an analytic variety with singular points. Using Sato's b-function, we rigorously prove that the free energy, or Bayesian stochastic complexity, is asymptotically equal to λ1 log n − (m1 − 1) log log n + constant, where λ1 is a rational number, m1 is a natural number, and n is the number of training samples. We also give an algorithm that computes λ1 and m1 from a resolution of singularities. In regular models, 2λ1 equals the number of parameters and m1 = 1, whereas in non-regular models such as neural networks, 2λ1 is smaller than the number of parameters and m1 ≥ 1.
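The asymptotic form suggests a simple numerical check: if the free energy F(n) can be evaluated at several sample sizes, λ1 and m1 can be estimated by regressing F(n) on log n and log log n. The sketch below (Python with NumPy) is illustrative only; the synthetic values λ1 = 3/4, m1 = 2 and the noise level are assumptions for demonstration, not results from the paper.

    # Minimal sketch: recover lambda_1 and m_1 by least squares, assuming
    # observed free-energy values follow the asymptotic form
    #   F(n) ~ lambda_1 * log(n) - (m_1 - 1) * log(log(n)) + const.
    # The "true" values below are hypothetical, chosen only to test recovery.
    import numpy as np

    rng = np.random.default_rng(0)

    lam_true, m_true, const_true = 0.75, 2, 5.0
    n = np.logspace(2, 6, 50)          # sample sizes from 1e2 to 1e6

    # Synthetic "observed" free energy following the theorem, plus noise.
    F = lam_true * np.log(n) - (m_true - 1) * np.log(np.log(n)) + const_true
    F += rng.normal(scale=0.02, size=n.size)

    # Design matrix with columns log(n), -log(log(n)), 1.
    X = np.column_stack([np.log(n), -np.log(np.log(n)), np.ones_like(n)])
    (lam_hat, m_minus_1_hat, c_hat), *_ = np.linalg.lstsq(X, F, rcond=None)

    print(f"lambda_1 estimate: {lam_hat:.3f}  (true {lam_true})")
    print(f"m_1 estimate:      {m_minus_1_hat + 1:.2f}  (true {m_true})")

Because log log n grows extremely slowly, separating the (m1 − 1) log log n term from the constant requires a wide range of sample sizes; this is one reason the exact values are obtained algebraically, via resolution of singularities, rather than by fitting.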