Capturing nonlinear dependencies in natural images using ICA and mixture of Laplacian distribution

  • Authors:
  • Hyun-Jin Park; Te-Won Lee

  • Affiliations:
  • Institute for Neural Computation (INC), University of California San Diego (UCSD), 9500 Gilman Drive, La Jolla, CA 92093-0523, USA (both authors)

  • Venue:
  • Neurocomputing
  • Year:
  • 2006


Abstract

Capturing dependencies in images in an unsupervised manner is important for many image-processing applications and for understanding the structure of natural image signals. Data-generative linear models such as principal component analysis (PCA) and independent component analysis (ICA) have been shown to capture low-level features such as oriented edges in images. However, these models capture only linear dependencies, and their modeling capability is therefore limited. We propose a new method for capturing nonlinear dependencies in images of natural scenes. This method is an extension of the linear ICA method and builds on a hierarchical representation. The model makes use of a lower-level linear ICA representation and a subsequent mixture of Laplacian distributions for learning the nonlinear dependencies in an image. The model parameters are learned via the expectation-maximization (EM) algorithm, and the model accurately captures variance correlations and other higher-order structures in a simple and consistent manner. We visualize the learned variance correlation structure and demonstrate applications to automatic image segmentation and image denoising.
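
The sketch below illustrates the two-stage idea described in the abstract, not the authors' exact implementation: first learn a linear ICA representation of image patches, then fit a mixture of Laplacian distributions to the ICA coefficients with EM, so that each mixture component models a different pattern of coefficient variances (i.e. variance correlation across filters). The input `patches` array, the number of mixture components `K`, and the use of scikit-learn's FastICA are assumptions made here for illustration only.

```python
import numpy as np
from sklearn.decomposition import FastICA

def fit_ica_laplacian_mixture(patches, n_components=64, K=8, n_iter=50, seed=0):
    """patches: (n_samples, patch_dim) array of whitened image patches (assumed given)."""
    rng = np.random.default_rng(seed)

    # Stage 1: linear ICA representation of the patches.
    ica = FastICA(n_components=n_components, random_state=seed, max_iter=500)
    s = ica.fit_transform(patches)                    # (n_samples, n_components) coefficients
    a = np.abs(s) + 1e-8                              # Laplacian likelihood depends only on |s|

    n, d = s.shape
    pi = np.full(K, 1.0 / K)                          # mixture weights
    b = rng.uniform(0.5, 1.5, size=(K, d)) * a.mean(axis=0)   # per-component Laplacian scales

    for _ in range(n_iter):
        # E-step: responsibilities under factorial Laplacian components (log domain for stability).
        log_lik = -(a[:, None, :] / b[None, :, :]).sum(axis=2) \
                  - np.log(2.0 * b).sum(axis=1)[None, :]        # (n, K)
        log_post = np.log(pi)[None, :] + log_lik
        log_post -= log_post.max(axis=1, keepdims=True)
        r = np.exp(log_post)
        r /= r.sum(axis=1, keepdims=True)

        # M-step: update mixture weights and per-component scales.
        nk = r.sum(axis=0) + 1e-12
        pi = nk / n
        b = (r.T @ a) / nk[:, None]

    return ica, pi, b, r
```

In this sketch the responsibilities `r` assign each patch softly to a mixture component, which is one plausible way to obtain the kind of unsupervised image segmentation mentioned in the abstract; the per-component scales `b` encode which groups of ICA filters tend to be active together.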