Optimizing the kernel in the empirical feature space

  • Authors:
  • Huilin Xiong; M. N. S. Swamy; M. O. Ahmad

  • Affiliations:
  • Dept. of Electr. & Comput. Eng., Concordia Univ., Montreal, Que., Canada

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 2005

Abstract

In this paper, we present a method of kernel optimization that maximizes a measure of class separability in the empirical feature space, a Euclidean space in which the training data are embedded in such a way that the geometrical structure of the data in the feature space is preserved. Employing a data-dependent kernel, we derive an effective kernel optimization algorithm that maximizes the class separability of the data in the empirical feature space. It is shown that there exists a close relationship between the class separability measure introduced here and the alignment measure recently defined by Cristianini. Extensive simulations are carried out, showing that the optimized kernel is more adaptive to the input data and leads to a substantial, sometimes significant, improvement in the performance of various data classification algorithms.
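
The abstract describes the approach only at a high level; the sketch below illustrates the two ingredients it mentions, namely embedding the training data into the empirical feature space via an eigendecomposition of the kernel (Gram) matrix, and scoring class separability there with a between/within-class scatter ratio. The data-dependent kernel form `K_q[i, j] = q[i] * q[j] * K0[i, j]` and the trace-ratio criterion are common choices consistent with this line of work, but this is not a reproduction of the paper's exact objective or update rule; the function names, the RBF base kernel, and the random reweighting vector `q` are illustrative assumptions.

```python
import numpy as np

def empirical_embedding(K, tol=1e-10):
    """Embed training points into the empirical feature space.

    Given an n x n kernel matrix K = P diag(lams) P^T, the rows of
    P * sqrt(lams) give a Euclidean embedding whose inner products
    reproduce K, preserving the feature-space geometry of the data.
    """
    lams, P = np.linalg.eigh(K)              # K is symmetric PSD (up to noise)
    keep = lams > tol                         # drop numerically zero directions
    return P[:, keep] * np.sqrt(lams[keep])   # shape: (n, r)

def class_separability(Z, y):
    """Trace ratio tr(S_b) / tr(S_w) of between- and within-class scatter."""
    mu = Z.mean(axis=0)
    s_b = s_w = 0.0
    for c in np.unique(y):
        Zc = Z[y == c]
        mu_c = Zc.mean(axis=0)
        s_b += len(Zc) * np.sum((mu_c - mu) ** 2)   # between-class scatter
        s_w += np.sum((Zc - mu_c) ** 2)             # within-class scatter
    return s_b / s_w

def data_dependent_kernel(K0, q):
    """Reweight a base kernel matrix K0 with per-sample factors q."""
    return (q[:, None] * q[None, :]) * K0

# Usage sketch: compare separability before and after reweighting K0.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (20, 5)), rng.normal(1.5, 1.0, (20, 5))])
y = np.array([0] * 20 + [1] * 20)
K0 = np.exp(-0.5 * np.sum((X[:, None] - X[None, :]) ** 2, axis=-1))  # RBF Gram matrix
q = rng.uniform(0.5, 1.5, len(y))   # stand-in for factors an optimizer would tune
for K in (K0, data_dependent_kernel(K0, q)):
    print(class_separability(empirical_embedding(K), y))
```

In the paper's setting, the factors `q` would be adjusted to increase the separability criterion rather than drawn at random; the sketch only shows how a change in the data-dependent kernel propagates to the separability score measured in the empirical feature space.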