Learning nonlinear manifolds based on mixtures of localized linear manifolds under a self-organizing framework

  • Authors:
  • Huicheng Zheng;Wei Shen;Qionghai Dai;Sanqing Hu;Zhe-Ming Lu

  • Affiliations:
  • School of Information Science and Technology, Sun Yat-sen University, 510275 Guangzhou, China;School of Information Science and Technology, Sun Yat-sen University, 510275 Guangzhou, China;Department of Automation, Tsinghua University, 100084 Beijing, China;School of Biomedical Engineering, Drexel University, PA, 19104, USA;School of Aeronautics and Astronautics, Zhejiang University, 310058 Hangzhou, China

  • Venue:
  • Neurocomputing
  • Year:
  • 2009

Abstract

This paper presents a neural model that learns low-dimensional nonlinear manifolds embedded in a higher-dimensional data space as mixtures of local linear manifolds under a self-organizing framework. Compared to similar networks, the local linear manifolds learned by our network represent local data distributions more faithfully thanks to a new distortion measure, which removes the confusion between sub-models present in many comparable mixture models. Each neuron in the network asymptotically learns a mean vector and a principal subspace of the data in its local region. It is proved that each sub-model has no local extrema. Experiments show that the new mixture model adapts to nonlinear manifolds of various data distributions better than similar models. The online-learning property of this model is desirable when the data set is very large, when computational efficiency is of paramount importance, or when data arrive sequentially. We further show an application of this model to the recognition of handwritten digit images based on mixtures of local linear manifolds.
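The abstract's core mechanism — each neuron maintaining a mean vector and a principal subspace, with a winner chosen by distortion (distance to the local linear manifold) and updated online — can be sketched as follows. This is a hedged illustration, not the paper's algorithm: the class name, the squared-distance-to-subspace distortion, and the Oja-style subspace update with QR re-orthonormalization are my assumptions; the paper's own distortion measure and self-organizing cooperation between neurons are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

class LocalLinearManifoldMixture:
    """Sketch of an online mixture of local linear manifolds.

    Each unit keeps a mean vector and an orthonormal basis of a local
    principal subspace. For an input x, the winner is the unit whose
    local manifold reconstructs x with the smallest squared error, and
    only the winner's mean and basis are updated (hard competition;
    the paper's distortion measure and neighborhood scheme differ).
    """

    def __init__(self, n_units, dim, subspace_dim, lr=0.05):
        self.means = rng.normal(size=(n_units, dim))
        # One random orthonormal basis per unit: (n_units, dim, subspace_dim).
        self.bases = np.stack([
            np.linalg.qr(rng.normal(size=(dim, subspace_dim)))[0]
            for _ in range(n_units)
        ])
        self.lr = lr

    def distortions(self, x):
        # Squared distance from x to each unit's local linear manifold:
        # ||(x - mean) - projection onto the unit's subspace||^2.
        d = x - self.means                               # (n_units, dim)
        coords = np.einsum('nds,nd->ns', self.bases, d)  # subspace coords
        recon = np.einsum('nds,ns->nd', self.bases, coords)
        return np.sum((d - recon) ** 2, axis=1)

    def update(self, x):
        w = int(np.argmin(self.distortions(x)))  # winning unit
        d = x - self.means[w]
        self.means[w] += self.lr * d             # move mean toward x
        # Oja-style subspace update, then QR to restore orthonormality.
        y = self.bases[w].T @ d
        self.bases[w] += self.lr * np.outer(d, y)
        self.bases[w], _ = np.linalg.qr(self.bases[w])
        return w
```

Because each sample triggers only one constant-cost winner update, the model processes data in a single streaming pass, which is what makes the online-learning property attractive for very large or sequentially arriving data sets.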