In the class of hierarchical mixtures-of-experts (HME) models, “experts” in the exponential family with generalized linear mean functions of the form ψ(α+xTβ) are mixed according to a set of predictor-dependent local weights called the “gating functions,” where ψ(·) is the inverse link function. We provide regularity conditions on the experts and on the gating functions under which maximum-likelihood estimation, in the large-sample limit, produces a consistent and asymptotically normal estimator of the mean response. The regularity conditions are verified for Poisson, gamma, normal, and binomial experts.
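To make the mean-response structure concrete, here is a minimal one-level sketch of an HME-style mixture mean. It assumes softmax (multinomial-logit) gating, which is the standard choice but is not specified in the abstract; the function and parameter names (`hme_mean`, `gate_params`, `expert_params`) are hypothetical. Using `psi = np.exp` corresponds to Poisson experts with a log link.

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax: subtract the max before exponentiating.
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def hme_mean(x, gate_params, expert_params, psi=np.exp):
    """Mean response of a one-level mixture of GLM experts (hypothetical sketch).

    x             : predictor vector of length d
    gate_params   : (K, d+1) gating coefficients, softmax-weighted
    expert_params : (K, d+1) rows (alpha_k, beta_k) per expert
    psi           : inverse link; np.exp gives Poisson (log-link) experts
    """
    x1 = np.concatenate(([1.0], x))       # prepend intercept
    gates = softmax(gate_params @ x1)     # local weights g_k(x), sum to 1
    means = psi(expert_params @ x1)       # expert means psi(alpha_k + x' beta_k)
    return gates @ means                  # mixture mean: sum_k g_k(x) * psi(...)
```

With all gating coefficients zero the weights are uniform, so the mixture mean is just the average of the expert means, which is a quick sanity check on the implementation.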