Singularities in the parameter spaces of hierarchical learning machines are known to be a main cause of the slow convergence of gradient descent learning. The EM algorithm, another learning algorithm that yields a maximum likelihood estimator, also suffers from slow convergence, which often appears when the overlap between components is large. We analyze the dynamics of the EM algorithm for Gaussian mixtures around singularities and show that there exists a slow manifold caused by the singular structure, which is closely related to the slow convergence of the EM algorithm. We also conduct numerical simulations to confirm the theoretical analysis. Through the simulations, we compare the dynamics of the EM algorithm with those of gradient descent, and show that their slow dynamics are caused by the same singular structure and hence exhibit the same behavior around singularities.
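
The following is a minimal sketch, not the paper's exact setup, of the phenomenon the abstract describes: a one-dimensional two-component Gaussian mixture with unit variances, where a small separation between the true means puts the estimator near the singular set {mu1 = mu2}. It runs both EM and plain gradient ascent on the same average log-likelihood from an initialization near the singularity, so one can watch both algorithms creep along the slow manifold before the components separate. All parameter values (separation 0.3, learning rate 0.05, iteration counts) are illustrative assumptions.

# Minimal sketch: EM vs. gradient ascent near the singularity {mu1 = mu2}
# of a 1-D two-component Gaussian mixture with unit variances.
import numpy as np

rng = np.random.default_rng(0)

# Data from two heavily overlapping components (the near-singular regime,
# where the mixing weight is poorly identified).
n = 2000
x = np.where(rng.random(n) < 0.5,
             rng.normal(0.0, 1.0, n),
             rng.normal(0.3, 1.0, n))

def log_likelihood(w, mu1, mu2, x):
    """Average log-likelihood of the mixture w*N(mu1,1) + (1-w)*N(mu2,1)."""
    p1 = np.exp(-0.5 * (x - mu1) ** 2) / np.sqrt(2 * np.pi)
    p2 = np.exp(-0.5 * (x - mu2) ** 2) / np.sqrt(2 * np.pi)
    return np.mean(np.log(w * p1 + (1 - w) * p2))

def em_step(w, mu1, mu2, x):
    """One EM iteration: E-step responsibilities, closed-form M-step.
    Normalizing constants cancel because both variances equal 1."""
    p1 = w * np.exp(-0.5 * (x - mu1) ** 2)
    p2 = (1 - w) * np.exp(-0.5 * (x - mu2) ** 2)
    r = p1 / (p1 + p2)                    # responsibility of component 1
    return (r.mean(),
            (r * x).sum() / r.sum(),
            ((1 - r) * x).sum() / (1 - r).sum())

def grad_step(w, mu1, mu2, x, lr=0.05):
    """One gradient-ascent step on the same average log-likelihood."""
    p1 = np.exp(-0.5 * (x - mu1) ** 2)
    p2 = np.exp(-0.5 * (x - mu2) ** 2)
    mix = w * p1 + (1 - w) * p2           # shared constants cancel in ratios
    dw = np.mean((p1 - p2) / mix)
    dmu1 = np.mean(w * p1 * (x - mu1) / mix)
    dmu2 = np.mean((1 - w) * p2 * (x - mu2) / mix)
    return (np.clip(w + lr * dw, 1e-3, 1 - 1e-3),
            mu1 + lr * dmu1,
            mu2 + lr * dmu2)

# Initialize near the singular set: |mu1 - mu2| grows only slowly at first
# for both algorithms, reflecting the shared slow manifold.
for name, step in [("EM", em_step), ("gradient", grad_step)]:
    w, mu1, mu2 = 0.5, 0.10, 0.12
    for t in range(1, 501):
        w, mu1, mu2 = step(w, mu1, mu2, x)
        if t in (1, 10, 100, 500):
            print(f"{name:8s} t={t:4d}  |mu1-mu2|={abs(mu1 - mu2):.4f}  "
                  f"LL={log_likelihood(w, mu1, mu2, x):.5f}")

Printing |mu1 - mu2| alongside the log-likelihood makes the shared mechanism visible: for both update rules the separation stays small for many iterations while the likelihood barely improves, which is the slow-manifold behavior the analysis attributes to the singular structure.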