In hierarchical learning machines such as neural networks, Bayesian learning provides better generalization performance than maximum likelihood estimation. However, its accurate approximation by the Markov chain Monte Carlo (MCMC) method requires an enormous computational cost. The exchange Monte Carlo (EMC) method was proposed as an improvement of the MCMC method. Although its effectiveness has been demonstrated not only in Bayesian learning but also in many other fields, the mathematical foundation of the EMC method has not yet been established. In this paper, we clarify the asymptotic behavior of the symmetrized Kullback divergence and the average exchange ratio, which are used as criteria for designing the EMC method.
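To make the exchange step and the exchange ratio concrete, the following is a minimal Python sketch of an EMC (replica-exchange) sampler. The double-well energy function, temperature ladder, proposal scale, and iteration count are illustrative assumptions, not the setting analyzed in the paper; the sketch only shows the standard mechanics: Metropolis updates within each tempered distribution, swap proposals between adjacent temperatures accepted with probability min(1, exp(Δβ·ΔE)), and the empirical average exchange ratio per adjacent pair.

```python
import numpy as np

rng = np.random.default_rng(0)

def energy(x):
    """Toy double-well energy; stands in for the error term in Bayesian learning."""
    return (x ** 2 - 1.0) ** 2

# Inverse-temperature ladder beta_1 < ... < beta_K, with beta_K = 1 targeting
# the distribution of interest.  The ladder shape is an illustrative choice.
betas = np.linspace(0.1, 1.0, 8)
K = len(betas)
x = rng.normal(size=K)                 # one chain per temperature
attempts = np.zeros(K - 1)
accepts = np.zeros(K - 1)

for step in range(20000):
    # (1) Metropolis update within each tempered distribution exp(-beta_k * E(x)).
    for k in range(K):
        prop = x[k] + rng.normal(scale=0.5)
        if np.log(rng.random()) < -betas[k] * (energy(prop) - energy(x[k])):
            x[k] = prop
    # (2) Exchange step: propose swapping the states of a random adjacent pair.
    k = rng.integers(K - 1)
    delta = (betas[k + 1] - betas[k]) * (energy(x[k + 1]) - energy(x[k]))
    attempts[k] += 1
    if np.log(rng.random()) < delta:   # accept with probability min(1, exp(delta))
        x[k], x[k + 1] = x[k + 1], x[k]
        accepts[k] += 1

# Average exchange ratio for each adjacent pair: the empirical counterpart of
# the design criterion discussed in the paper.
print(accepts / np.maximum(attempts, 1))
```

In practice, the temperature ladder is tuned so that these per-pair exchange ratios stay away from zero and are roughly uniform across the ladder, which is precisely why the asymptotic behavior of the average exchange ratio matters for designing the EMC method.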