Learning the optimal neighborhood kernel for classification

  • Authors:
  • Jun Liu;Jianhui Chen;Songcan Chen;Jieping Ye

  • Affiliations:
  • Department of Computer Science and Engineering, Arizona State University and Department of Computer Science and Engineering, Nanjing University of Aeronautics and Astronautics;Department of Computer Science and Engineering, Arizona State University;Department of Computer Science and Engineering, Nanjing University of Aeronautics and Astronautics;Department of Computer Science and Engineering, Arizona State University

  • Venue:
  • IJCAI'09 Proceedings of the 21st International Joint Conference on Artificial Intelligence
  • Year:
  • 2009


Abstract

Kernel methods have been successfully applied in many applications. The kernel matrix plays an important role in kernel-based learning methods, but the "ideal" kernel matrix is usually unknown in practice and needs to be estimated. In this paper, we propose to directly learn the "ideal" kernel matrix (called the optimal neighborhood kernel matrix) from a pre-specified kernel matrix for improved classification performance. We assume that the pre-specified kernel matrix generated from the specific application is a noisy observation of the ideal one. The resulting optimal neighborhood kernel matrix is shown to be the sum of the pre-specified kernel matrix and a rank-one matrix. We formulate the problem of learning the optimal neighborhood kernel as a constrained quartic problem, and propose to solve it using two methods: the level method and constrained gradient descent. Empirical results on several benchmark data sets demonstrate the efficiency and effectiveness of the proposed algorithms.
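
The abstract only states the structural result (learned kernel = pre-specified kernel + a rank-one matrix) and names the solvers; the paper's quartic formulation is not reproduced here. The sketch below is a minimal, self-contained NumPy illustration of the constrained (projected) gradient-descent idea applied to a rank-one kernel perturbation. It uses kernel-target alignment with the label kernel y y^T as a stand-in objective, and the dataset, the neighborhood radius r, the step size, and the iteration count are all illustrative assumptions, not the authors' settings.

```python
# Illustrative sketch: projected gradient ascent on a rank-one kernel perturbation.
# Objective: kernel-target alignment <G, y y^T>_F / (||G||_F ||y y^T||_F),
# a stand-in for the paper's quartic formulation (assumption).
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-class data and a "pre-specified" (noisy) linear kernel.
n, d = 60, 5
X = rng.normal(size=(n, d))
y = np.sign(X[:, 0] + 0.3 * rng.normal(size=n))   # labels in {-1, +1}
y[y == 0] = 1.0
K = X @ X.T + 0.5 * rng.normal(size=(n, n))
K = (K + K.T) / 2.0                               # symmetrize the noise

T = np.outer(y, y)                                # "ideal" target kernel y y^T

def alignment(G):
    """Kernel-target alignment <G, T>_F / (||G||_F ||T||_F)."""
    return np.vdot(G, T) / (np.linalg.norm(G) * np.linalg.norm(T))

# Learn a rank-one perturbation c c^T by projected gradient ascent, keeping
# ||c c^T||_F = ||c||^2 <= r so that G = K + c c^T stays in a ball around K.
r = 0.5 * np.linalg.norm(K)                       # neighborhood radius (hypothetical choice)
c = 1e-2 * rng.normal(size=n)

for _ in range(300):
    G = K + np.outer(c, c)
    num = np.vdot(G, T)                           # <G, T>_F
    den = np.linalg.norm(G)                       # ||G||_F
    grad_num = 2.0 * (c @ y) * y                  # d<G, T>_F / dc
    grad_den = 2.0 * (G @ c) / den                # d||G||_F / dc
    grad = (grad_num * den - num * grad_den) / den**2
    c = c + 0.5 * grad / (np.linalg.norm(grad) + 1e-12)  # normalized ascent step
    if c @ c > r:                                 # project back onto ||c||^2 <= r
        c *= np.sqrt(r / (c @ c))

G = K + np.outer(c, c)
print(f"alignment of pre-specified K : {alignment(K):.3f}")
print(f"alignment of K + c c^T      : {alignment(G):.3f}")
```

The projection step keeps the learned kernel inside a Frobenius-norm ball around the pre-specified K, mirroring the "neighborhood" intuition in the title; the paper itself derives the optimal rank-one term from its constrained quartic problem rather than from alignment.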