An incremental manifold learning algorithm based on the small world model

  • Authors:
  • Lukui Shi; Qingxin Yang; Enhai Liu; Jianwei Li; Yongfeng Dong

  • Affiliations:
  • School of Computer Science and Engineering, Hebei University of Technology, Tianjin, China (Shi, Liu, Li, Dong); School of Electrical Engineering and Automation, Tianjin Polytechnic University, Tianjin, China (Yang)

  • Venue:
  • LSMS/ICSEE'10 Proceedings of the 2010 international conference on Life system modeling and intelligent computing, and 2010 international conference on Intelligent computing for sustainable energy and environment: Part I
  • Year:
  • 2010

Abstract

Manifold learning performs nonlinear dimensionality reduction on data in a high-dimensional space. ISOMAP, LLE, Laplacian Eigenmaps, LTSA, and multilayer autoencoders are representative algorithms. Most of them are defined only on the training set and run in batch mode; they provide no model or formula for mapping new data into the low-dimensional space. In this paper, we propose an incremental manifold learning algorithm based on the small world model, which generalizes ISOMAP to new samples. First, k nearest neighbors and several faraway points are selected from the training set for each new sample. Then the low-dimensional embedding of the new sample is obtained by preserving the geodesic distances between it and those points. Experiments demonstrate that the presented method effectively projects new samples into the low-dimensional space and has lower computational complexity.
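
The out-of-sample step described in the abstract can be illustrated with a short sketch. The code below is a minimal illustration, not the authors' exact procedure: it trains a batch ISOMAP embedding with scikit-learn, approximates the geodesic distance from a new sample to a set of reference points (its k nearest neighbors plus a few faraway training points) by routing through the k-NN graph, and then places the sample by minimizing the mismatch between embedding distances and those geodesic estimates. The toy data, the `embed_new_sample` helper, and the L-BFGS stress minimization are assumptions made for illustration only.

```python
# A minimal sketch (not the authors' exact procedure) of embedding a new sample
# by preserving approximate geodesic distances to reference points:
# k nearest neighbors plus a few faraway training points.
import numpy as np
from scipy.optimize import minimize
from scipy.sparse.csgraph import shortest_path
from sklearn.neighbors import kneighbors_graph
from sklearn.manifold import Isomap

rng = np.random.default_rng(0)

# Toy training data: a noisy curve embedded in 3-D (swiss-roll-like), for illustration.
t = rng.uniform(0, 3 * np.pi, 400)
X_train = np.column_stack([t * np.cos(t), t * np.sin(t), rng.uniform(0, 5, t.size)])

k, d = 10, 2
iso = Isomap(n_neighbors=k, n_components=d)
Y_train = iso.fit_transform(X_train)            # batch ISOMAP embedding of the training set

# Geodesic distances between training points: shortest paths on the k-NN graph.
G = kneighbors_graph(X_train, n_neighbors=k, mode="distance")
D_geo = shortest_path(G, method="D", directed=False)

def embed_new_sample(x_new, n_far=5):
    """Place one new sample in the existing embedding (hypothetical helper)."""
    # Euclidean distances from the new sample to all training points.
    d_euc = np.linalg.norm(X_train - x_new, axis=1)
    nbrs = np.argsort(d_euc)[:k]                # k nearest neighbors
    far = np.argsort(d_euc)[-n_far:]            # a few faraway points
    refs = np.concatenate([nbrs, far])

    # Approximate geodesic distance to each reference point: hop to the best
    # neighbor, then follow the precomputed shortest path in the graph.
    g = np.min(d_euc[nbrs, None] + D_geo[nbrs][:, refs], axis=0)

    # Solve for low-dimensional coordinates that preserve those distances.
    def stress(y):
        return np.sum((np.linalg.norm(Y_train[refs] - y, axis=1) - g) ** 2)

    y0 = Y_train[nbrs].mean(axis=0)             # warm start at the neighbors' centroid
    return minimize(stress, y0, method="L-BFGS-B").x

x_new = np.array([1.0, 1.0, 2.5])
print("low-dimensional coordinates:", embed_new_sample(x_new))
```

Because only the distances to a small reference set are used, each new sample costs a nearest-neighbor search plus a small optimization, rather than recomputing the full geodesic distance matrix and eigendecomposition as batch ISOMAP would.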