Hierarchical mixtures of experts and the EM algorithm (Neural Computation)
Self-organization using Potts models (Neural Networks)
Independent component analysis using Potts models (IEEE Transactions on Neural Networks)
Function approximation using generalized adalines (IEEE Transactions on Neural Networks)
In addition to the Expectation-Maximization (EM) algorithm, the Least-Mean-Square (LMS) algorithm is employed as a complementary method for training the model parameters of Cluster-Weighted Modeling (CWM). Because EM and LMS optimize different objective functions, the result of LMS training can be used to reinitialize CWM's model parameters, which provides a way to mitigate the local-minimum problem.
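The EM-then-LMS idea above can be illustrated with a minimal sketch. The code below is a hypothetical toy example, not the paper's actual formulation: it fits a one-dimensional cluster-weighted model (two Gaussian-gated local linear experts) with a few EM-style iterations and responsibility-weighted least squares, then refines the expert weights with stochastic LMS updates on the squared prediction error. All names, cluster counts, and learning rates are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: piecewise-linear target with noise (illustrative only).
x = rng.uniform(-1, 1, 200)
y = np.where(x < 0, -1.0 + 0.5 * x, 1.0 + 2.0 * x) + 0.05 * rng.normal(size=200)

K = 2                                  # number of clusters / local experts
X = np.column_stack([np.ones_like(x), x])   # design matrix [1, x]

# --- EM-style pass: soft-assign points to clusters, then fit local lines ---
mu = np.array([-0.5, 0.5])             # initial cluster centers
sigma = np.array([0.3, 0.3])           # initial cluster widths
for _ in range(20):
    # E-step: responsibilities from Gaussian input densities
    d = np.exp(-0.5 * ((x[:, None] - mu[None, :]) / sigma[None, :]) ** 2)
    r = d / d.sum(axis=1, keepdims=True)
    # M-step: update centers and widths from responsibilities
    mu = (r * x[:, None]).sum(axis=0) / r.sum(axis=0)
    sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0)
                    / r.sum(axis=0)) + 1e-6

# Responsibility-weighted least squares for each expert's line a + b*x
W = np.zeros((K, 2))
for k in range(K):
    Rk = r[:, k]
    A = X.T @ (Rk[:, None] * X)
    W[k] = np.linalg.solve(A, X.T @ (Rk * y))

def predict(W):
    """Gated mixture prediction over the whole toy data set."""
    d = np.exp(-0.5 * ((x[:, None] - mu[None, :]) / sigma[None, :]) ** 2)
    g = d / d.sum(axis=1, keepdims=True)      # gating weights
    return (g * (X @ W.T)).sum(axis=1)

# --- LMS pass: per-sample gradient descent on the squared prediction error ---
eta = 0.05                              # assumed learning rate
for _ in range(50):
    for i in rng.permutation(len(x)):
        d = np.exp(-0.5 * ((x[i] - mu) / sigma) ** 2)
        g = d / d.sum()
        err = y[i] - g @ (W @ X[i])     # prediction error for sample i
        W += eta * err * g[:, None] * X[i][None, :]   # LMS weight update

mse = np.mean((predict(W) - y) ** 2)
```

The LMS-refined weights `W` could then serve as the reinitialization point for another EM run, since the two passes descend different objective functions and thus tend to escape each other's local minima.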