A new kernelization framework for Mahalanobis distance learning algorithms

  • Authors:
  • Ratthachat Chatpatanasiri; Teesid Korsrilabutr; Pasakorn Tangchanachaianan; Boonserm Kijsirikul

  • Affiliation:
  • Department of Computer Engineering, Chulalongkorn University, Pathumwan, Bangkok 10330, Thailand (all authors)

  • Venue:
  • Neurocomputing
  • Year:
  • 2010


Abstract

This paper develops a new framework for kernelizing Mahalanobis distance learners. The new KPCA-trick framework offers several practical advantages over the classical kernel-trick framework: no new mathematical derivations and no reprogramming are required to obtain a kernelized implementation, a way to speed up an algorithm is provided at no extra cost, and troublesome issues such as singularity are avoided. Rigorous representer theorems in countably infinite-dimensional spaces are given to validate the framework. Furthermore, unlike previous works, which always apply brute-force methods to select a kernel, we derive a kernel-alignment formula based on quadratic programming that can efficiently construct an appropriate kernel for a given dataset.
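
The sketch below illustrates the general KPCA-trick idea described in the abstract: rather than deriving kernel formulas for a particular Mahalanobis learner, the data are first mapped with Kernel PCA and the unchanged linear learner is then run in the resulting finite-dimensional space. This is only a minimal sketch assuming scikit-learn; the toy within-class-whitening "learner" and the kernel parameters are illustrative placeholders, not the algorithms or settings used in the paper.

```python
# Minimal sketch of the KPCA trick: Kernel PCA first, then any ordinary
# linear Mahalanobis distance learner on the KPCA features.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.decomposition import KernelPCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split


def within_class_whitening(Z, y, reg=1e-3):
    """Toy linear Mahalanobis learner (placeholder): M = (within-class covariance)^-1."""
    d = Z.shape[1]
    Sw = np.zeros((d, d))
    for c in np.unique(y):
        Zc = Z[y == c] - Z[y == c].mean(axis=0)
        Sw += Zc.T @ Zc
    Sw /= len(y)
    M = np.linalg.inv(Sw + reg * np.eye(d))  # learned metric matrix
    # Return a linear map L with M = L^T L, so the learned distance is
    # ordinary Euclidean distance after multiplying by L.
    w, V = np.linalg.eigh(M)
    return V @ np.diag(np.sqrt(np.clip(w, 0, None))) @ V.T


X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Step 1 (the KPCA trick): nonlinear feature map via Kernel PCA.
kpca = KernelPCA(n_components=20, kernel="rbf", gamma=0.5).fit(X_tr)
Z_tr, Z_te = kpca.transform(X_tr), kpca.transform(X_te)

# Step 2: run the unchanged linear Mahalanobis learner on the KPCA features,
# with no kernel-specific re-derivation or reprogramming.
L = within_class_whitening(Z_tr, y_tr)
knn = KNeighborsClassifier(n_neighbors=3).fit(Z_tr @ L, y_tr)
print("kNN accuracy with kernelized metric:", knn.score(Z_te @ L, y_te))
```

Any linear metric learner could be substituted at step 2; the point of the framework is that the kernelized version comes for free once the KPCA features are available.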