Asymptotic error bounds for kernel-based Nyström low-rank approximation matrices

  • Authors:
  • Lo-Bin Chang;Zhidong Bai;Su-Yun Huang;Chii-Ruey Hwang


  • Venue:
  • Journal of Multivariate Analysis
  • Year:
  • 2013

Abstract

Many kernel-based learning algorithms have a computational load that scales with the sample size n due to the column size of the full kernel Gram matrix K. This article considers the Nyström low-rank approximation. It uses a reduced kernel K̂, of size n × m, consisting of m columns (say columns i_1, i_2, ..., i_m) randomly drawn from K. The approximation takes the form K ≈ K̂ U^{-1} K̂^T, where U is the reduced m × m matrix formed by rows i_1, i_2, ..., i_m of K̂. Often m is much smaller than the sample size n, which yields a thin rectangular reduced kernel and leads to learning algorithms that scale with the column size m. The quality of the matrix approximation can be assessed by the closeness of its eigenvalues and eigenvectors to those of K. In this article, asymptotic error bounds on eigenvalues and eigenvectors are derived for the Nyström low-rank approximation matrix.
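The Nyström construction described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' code: the RBF kernel, the sample sizes n and m, and the use of a pseudoinverse for U (to guard against near-singularity) are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data and an RBF kernel Gram matrix K (n x n); choices are illustrative.
n, m = 200, 20
X = rng.standard_normal((n, 2))
sq_dists = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
K = np.exp(-sq_dists / 2.0)

# Randomly draw m column indices i_1, ..., i_m.
idx = rng.choice(n, size=m, replace=False)
K_hat = K[:, idx]       # reduced kernel K-hat, n x m
U = K_hat[idx, :]       # reduced m x m matrix: rows i_1, ..., i_m of K-hat

# Nystrom low-rank approximation: K ~ K_hat U^{-1} K_hat^T
# (pinv used in place of inv in case U is ill-conditioned)
K_nys = K_hat @ np.linalg.pinv(U) @ K_hat.T

rel_err = np.linalg.norm(K - K_nys) / np.linalg.norm(K)
```

Only the n × m block K̂ and the m × m block U need to be formed and stored, which is the source of the computational savings when m ≪ n.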