Kernel Optimal Component Analysis

  • Authors:
  • Qiang Zhang; Xiuwen Liu

  • Affiliations:
  • Florida State University, Tallahassee, FL

  • Venue:
  • CVPRW '04: Proceedings of the 2004 Conference on Computer Vision and Pattern Recognition Workshop (CVPRW'04), Volume 6
  • Year:
  • 2004


Abstract

Optimal component analysis (OCA) provides a general subspace formulation with many applications. Within the framework of linear representations, OCA poses the search for optimal representations as an optimization problem on an underlying manifold, such as a Grassmann manifold, and a stochastic optimization algorithm can then be used to derive optimal representations for recognition and other applications. However, in many applications the underlying manifold is intrinsically nonlinear, which limits the effectiveness of linear representations and thus of OCA. To overcome this fundamental limitation, in this paper we propose a kernelized version of optimal component analysis. The basic idea is to account for potential nonlinearity through a nonlinear feature mapping, so that linear representations in the resulting feature space can be applied effectively to nonlinear problems in the original space. The computational cost associated with the mapping is avoided by performing the mapping implicitly, using the reproducing property of a reproducing kernel Hilbert space. Kernel optimal component analysis therefore provides a general method for learning application-dependent representations, either linear or nonlinear, and an effective stochastic optimization algorithm is presented. Experimental results on recognition tasks show the feasibility and effectiveness of the proposed method.
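
To make the implicit-mapping idea concrete, below is a minimal sketch in Python/NumPy of how a feature-space subspace can be manipulated through kernel evaluations alone. This is not the authors' algorithm: the RBF kernel, the random coefficient matrix A, and all variable names are illustrative assumptions. The point is only that norms of and projections onto feature-space directions reduce to operations on the Gram matrix, so the nonlinear map phi(.) never has to be computed explicitly.

    import numpy as np

    def rbf_kernel(X, Y, gamma=1.0):
        # Gram matrix K[i, j] = exp(-gamma * ||x_i - y_j||^2); the RBF kernel
        # is just one admissible choice of reproducing kernel.
        sq = (np.sum(X**2, axis=1)[:, None]
              + np.sum(Y**2, axis=1)[None, :]
              - 2.0 * X @ Y.T)
        return np.exp(-gamma * sq)

    rng = np.random.default_rng(0)
    X = rng.standard_normal((50, 5))   # n = 50 training points in the given space

    # In the RKHS, each basis vector of a candidate subspace is a linear
    # combination of the mapped training points, u_k = sum_i A[i, k] * phi(x_i),
    # so the subspace is parameterized by the coefficient matrix A alone.
    A = rng.standard_normal((50, 3))   # illustrative coefficients, 3-dim subspace
    K = rbf_kernel(X, X)               # all inner products <phi(x_i), phi(x_j)>

    # RKHS norms ||u_k|| = sqrt(a_k^T K a_k), computed without forming phi(.)
    norms = np.sqrt(np.einsum('ik,ij,jk->k', A, K, A))

    # Projecting a new point z onto the subspace also needs only kernel values:
    # <phi(z), u_k> = sum_i A[i, k] * k(z, x_i)
    z = rng.standard_normal((1, 5))
    proj = (rbf_kernel(z, X) @ A) / norms
    print(proj)

Under these assumptions, a search over feature-space subspaces (as in the kernelized OCA the abstract describes) amounts to optimizing the coefficient matrix A against a recognition criterion, with every quantity evaluated through K.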