Self-organizing mixture networks for probability density estimation

  • Authors:
  • H. Yin; N. M. Allinson

  • Affiliations:
  • Dept. of Electr. Eng. & Electron., Univ. of Manchester Inst. of Sci. & Technol.

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 2001

Abstract

A self-organizing mixture network (SOMN) is derived for learning arbitrary density functions. The network minimizes the Kullback-Leibler information metric by means of stochastic approximation methods. The density functions are modeled as mixtures of parametric distributions. A mixture need not be homogeneous, i.e., its components can have different density profiles. The first layer of the network is similar to Kohonen's self-organizing map (SOM), but with the parameters of the component densities as the learning weights. The winning mechanism is based on maximum posterior probability, and updating of the weights is limited to a small neighborhood around the winner. The second layer accumulates the responses of these local nodes, weighted by the learned mixing parameters. The network possesses a simple structure and computational form, yet yields fast and robust convergence. It also has good generalization ability owing to the relative entropy criterion used. Applications to density profile estimation and pattern classification are presented. The SOMN can also provide insight into the role of the neighborhood function used in the SOM.
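To make the procedure described above concrete, below is a minimal sketch of a SOMN-style estimator with one-dimensional Gaussian components. It is based only on the abstract, not on the paper's exact update equations: the posterior-based winner selection, the neighborhood-limited stochastic updates of means, variances, and mixing parameters, and the second-layer mixture density are illustrative assumptions, and all names (somn_step, density, etc.) are hypothetical.

```python
import numpy as np

# Sketch of a self-organizing mixture network (SOMN) with 1-D Gaussian components.
# The update forms are plausible stochastic-approximation steps, not the exact
# equations of Yin & Allinson (2001).

rng = np.random.default_rng(0)

K = 10                                   # number of nodes (component densities) on a 1-D lattice
means = np.linspace(-3.0, 3.0, K)        # component means (the learning "weights")
variances = np.ones(K)                   # component variances
mixing = np.full(K, 1.0 / K)             # mixing parameters (second-layer weights)
grid = np.arange(K)                      # lattice positions used by the neighborhood function


def gaussian(x, mu, var):
    """Gaussian density, broadcast over the K components."""
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2.0 * np.pi * var)


def somn_step(x, lr, sigma_nbh):
    """One stochastic-approximation step on a single sample x."""
    global means, variances, mixing
    # Posterior (responsibility) of each component for x.
    p = mixing * gaussian(x, means, variances)
    post = p / p.sum()
    # Winner: the node with maximum posterior probability.
    winner = np.argmax(post)
    # Neighborhood function restricts updates to nodes near the winner.
    h = np.exp(-0.5 * (grid - winner) ** 2 / sigma_nbh ** 2)
    # Posterior- and neighborhood-weighted updates of the component parameters.
    g = lr * h * post
    diff = x - means
    means = means + g * diff
    variances = np.maximum(variances + g * (diff ** 2 - variances), 1e-3)
    mixing = mixing + lr * h * (post - mixing)
    mixing /= mixing.sum()


def density(xs):
    """Second layer: mixture density as the mixing-weighted sum of node responses."""
    xs = np.asarray(xs, dtype=float)[:, None]
    return (mixing * gaussian(xs, means, variances)).sum(axis=1)


# Usage: estimate a bimodal density from samples.
data = np.concatenate([rng.normal(-1.5, 0.5, 5000), rng.normal(2.0, 0.8, 5000)])
n = len(data)
for t, x in enumerate(rng.permutation(data)):
    lr = 0.05 * (1.0 - t / n) + 1e-3                  # decaying learning rate
    sigma_nbh = max(2.0 * (1.0 - t / n), 0.5)         # shrinking neighborhood width
    somn_step(x, lr, sigma_nbh)

print(density([-1.5, 0.0, 2.0]))  # estimated density at a few test points
```

As in the SOM, the learning rate and neighborhood width are annealed over training; the linear decay schedules above are one plausible choice, not a prescription from the paper.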