Intrinsic generalization analysis of low dimensional representations

  • Authors:
  • Xiuwen Liu; Anuj Srivastava; DeLiang Wang

  • Affiliations:
  • Department of Computer Science, Florida State University, Palmetto St. LOVE Building Rm 250, Tallahassee, FL; Department of Statistics, Florida State University, Tallahassee, FL; Department of Computer and Information Science and Center for Cognitive Science, The Ohio State University, Columbus, OH

  • Venue:
  • Neural Networks - 2003 Special issue: Advances in neural networks research — IJCNN'03
  • Year:
  • 2003


Abstract

Low dimensional representations of images impose equivalence relations on the image space; the induced equivalence class of an image is called its intrinsic generalization. The intrinsic generalization of a representation provides a novel way to measure its generalization ability and yields more fundamental insights than the commonly used recognition performance, which is heavily influenced by the choice of training and test data. We demonstrate the limitations of linear subspace representations by sampling their intrinsic generalization, and propose a nonlinear representation that overcomes these limitations. The proposed representation projects images nonlinearly onto the marginal densities of their filter responses, followed by linear projections of the marginals. Through experiments on large datasets, we show that representations with better intrinsic generalization also lead to better recognition performance.
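The pipeline described in the abstract — nonlinear projection of an image onto the marginal densities of its filter responses, followed by a linear projection of those marginals — can be sketched as below. This is a hypothetical illustration only: the filter bank, bin count, response range, and the random stand-in for the learned linear projection are all assumptions, not the paper's actual choices.

```python
import numpy as np

def marginal_density_representation(image, filters, n_bins=16, value_range=(-10.0, 10.0)):
    """Project an image onto the marginal densities of its filter responses.

    Hypothetical sketch of the abstract's idea: convolve with each filter,
    histogram the responses (approximating the marginal density), and
    concatenate the normalized histograms into one feature vector.
    """
    marginals = []
    for f in filters:
        # Valid-mode 2-D convolution via sliding windows (pure NumPy).
        h, w = f.shape
        windows = np.lib.stride_tricks.sliding_window_view(image, (h, w))
        responses = np.tensordot(windows, f, axes=([2, 3], [0, 1])).ravel()
        # Normalized histogram approximates the marginal density of responses.
        hist, _ = np.histogram(responses, bins=n_bins, range=value_range)
        marginals.append(hist / max(hist.sum(), 1))
    return np.concatenate(marginals)

rng = np.random.default_rng(0)
image = rng.standard_normal((32, 32))
filters = [rng.standard_normal((5, 5)) for _ in range(3)]  # assumed filter bank

m = marginal_density_representation(image, filters)
# Linear projection of the marginals; in practice this would be learned
# (e.g. by PCA), here a random matrix stands in for the learned projection.
W = rng.standard_normal((8, m.size))
feature = W @ m
```

Each filter contributes one normalized histogram, so `m` has `len(filters) * n_bins` entries, and the final feature is the low dimensional linear image of those stacked marginals.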