In this letter, we consider a mixture-of-experts structure in which m experts are mixed, each expert being a polynomial regression model of order k. We study the convergence rate of the maximum likelihood estimator, measured by how fast the Hellinger distance between the estimated density and the true density shrinks as the sample size n increases. The convergence rate is found to depend on both m and k, and certain choices of m and k are shown to yield near-optimal convergence rates.
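To make the model concrete, the following is a minimal sketch of maximum likelihood estimation for such a mixture via EM. It is illustrative only and makes simplifying assumptions not stated in the abstract: m Gaussian experts, each a polynomial regression of order k, and constant mixing proportions in place of an input-dependent gating network; the function name `fit_moe_em` and all hyperparameters are hypothetical.

```python
import numpy as np

def fit_moe_em(x, y, m=2, k=2, n_iter=100, seed=0):
    """EM sketch for a mixture of m polynomial regressions of order k.

    Simplifying assumptions (for illustration, not the paper's exact model):
    constant mixing proportions instead of a gating network, and a
    per-expert Gaussian noise variance.
    """
    rng = np.random.default_rng(seed)
    X = np.vander(x, k + 1)                  # design matrix: columns x^k, ..., x, 1
    weights = np.full(m, 1.0 / m)            # mixing proportions
    coefs = rng.normal(size=(m, k + 1))      # polynomial coefficients per expert
    sigma2 = np.full(m, np.var(y) + 1e-6)    # per-expert noise variances

    for _ in range(n_iter):
        # E-step: responsibilities r[i, j] = P(expert j | x_i, y_i)
        mu = X @ coefs.T                     # (n, m) predicted means
        log_p = (-0.5 * (y[:, None] - mu) ** 2 / sigma2
                 - 0.5 * np.log(2 * np.pi * sigma2)
                 + np.log(weights))
        log_p -= log_p.max(axis=1, keepdims=True)   # stabilize before exponentiating
        r = np.exp(log_p)
        r /= r.sum(axis=1, keepdims=True)

        # M-step: mixing weights and weighted least squares per expert
        weights = r.mean(axis=0)
        for j in range(m):
            w = r[:, j]
            Xw = X * w[:, None]
            coefs[j] = np.linalg.solve(Xw.T @ X + 1e-8 * np.eye(k + 1), Xw.T @ y)
            resid = y - X @ coefs[j]
            sigma2[j] = (w * resid ** 2).sum() / w.sum() + 1e-8
    return weights, coefs, sigma2
```

In this simplified setting the number of free parameters grows with both m and k, which is one intuition for why the convergence rate studied in the letter depends on both quantities.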