A note on mixtures of experts for multiclass responses: approximation rate and consistent Bayesian inference

  • Authors:
  • Yang Ge; Wenxin Jiang

  • Affiliations:
  • Northwestern University, Evanston, IL; Northwestern University, Evanston, IL

  • Venue:
  • ICML '06 Proceedings of the 23rd international conference on Machine learning
  • Year:
  • 2006

Abstract

We report that mixtures of m multinomial logistic regressions can be used to approximate a class of 'smooth' probability models for multiclass responses. With bounded second derivatives of the log-odds, the approximation rate is O(m^{-2/s}) in Hellinger distance or O(m^{-4/s}) in Kullback-Leibler divergence, where s = dim(x) is the dimension of the input space (i.e., the number of predictors). Given training data of size n, we also show that 'consistency' in multiclass regression and classification can be achieved, simultaneously for all classes, when posterior-based inference is performed in a Bayesian framework. Loosely speaking, such 'consistency' means that performance is, with high probability, close to the best possible for large n. Consistency can be achieved either by taking m = m_n, or by taking m to be uniformly distributed over {1, ..., m_n} under the prior, where m_n grows with n so that 1 ≺ m_n ≺ n^a in order of magnitude, for some a ∈ (0, 1).
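
The following is a minimal NumPy sketch of the model family discussed in the abstract: a mixture of m multinomial logistic regression 'experts' for a K-class response, combined by softmax gating weights that depend on the input x. The specific parametrization (linear softmax gate, per-expert intercepts) and all function and variable names are illustrative assumptions, not the paper's notation; the paper analyzes the approximation and Bayesian consistency properties of this family rather than any particular fitting procedure.

```python
# Minimal sketch (assumed parametrization): mixture of m multinomial logistic
# regression experts with a softmax gating network, for K-class responses.
import numpy as np

def softmax(z, axis=-1):
    # Numerically stable softmax along the given axis.
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def moe_class_probs(x, gate_W, gate_b, expert_W, expert_b):
    """Class probabilities P(y = k | x) under a mixture of experts.

    x        : (s,) input vector, s = dim(x)
    gate_W   : (m, s), gate_b : (m,)         -- softmax gating over m experts
    expert_W : (m, K, s), expert_b : (m, K)  -- one multinomial logit model per expert
    """
    gate = softmax(gate_W @ x + gate_b)                       # (m,) mixing weights
    expert_probs = softmax(expert_W @ x + expert_b, axis=-1)  # (m, K) per-expert class probs
    return gate @ expert_probs                                 # (K,) mixture class probabilities

# Example: m = 3 experts, K = 4 classes, s = 5 predictors (arbitrary random parameters).
rng = np.random.default_rng(0)
m, K, s = 3, 4, 5
p = moe_class_probs(rng.normal(size=s),
                    rng.normal(size=(m, s)), rng.normal(size=m),
                    rng.normal(size=(m, K, s)), rng.normal(size=(m, K)))
print(p, p.sum())  # a valid probability vector over the K classes (sums to 1)
```

In the paper's asymptotic results, m is the quantity that grows (as m_n, either deterministically or via a uniform prior over {1, ..., m_n}); the sketch above only illustrates how a fixed-m model maps an input to multiclass probabilities.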