Kernel self-optimization learning for kernel-based feature extraction and recognition

  • Authors:
  • Jun-Bao Li; Yun-Heng Wang; Shu-Chuan Chu; John F. Roddick

  • Affiliations:
  • Jun-Bao Li: Innovative Information Industry Research Center, Shenzhen Graduate School, Harbin Institute of Technology, China; and Department of Automatic Test and Control, Harbin Institute of Technology, China
  • Yun-Heng Wang: Innovative Information Industry Research Center, Shenzhen Graduate School, Harbin Institute of Technology, China
  • Shu-Chuan Chu: School of Computer Science, Engineering and Mathematics, Flinders University of South Australia, Australia
  • John F. Roddick: School of Computer Science, Engineering and Mathematics, Flinders University of South Australia, Australia

  • Venue:
  • Information Sciences: an International Journal
  • Year:
  • 2014

Abstract

Kernel learning has become an important research topic in machine learning, with wide applications in pattern recognition, computer vision, and image and signal processing. It offers a promising approach to nonlinear problems, including nonlinear feature extraction, classification and clustering. In kernel-based systems, however, the choice of kernel function and its parameters remains an open problem. Previous studies have presented methods that select parameters from a discrete set of values, but such methods do not change the structure of the data distribution in the kernel-induced mapping space, and so performance is not improved. To address this problem, this paper presents a uniform framework for kernel self-optimization that can adjust the data structure. The data-dependent kernel is extended and applied to kernel learning, and optimization equations based on two criteria for measuring data discrimination are solved to obtain the optimal parameter values. Experiments evaluate the framework on popular kernel learning methods, including kernel principal component analysis (KPCA), kernel discriminant analysis (KDA) and kernel locality-preserving projection (KLPP). The results show that kernel self-optimization is feasible for enhancing these kernel-based learning methods.
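The abstract does not give the paper's equations, but a common form of data-dependent kernel is the conformal transformation k(x, y) = q(x) q(y) k0(x, y), where the factor q(x) is expanded over a set of anchor points with coefficients that an optimization procedure can adjust. The sketch below, a minimal NumPy illustration under that assumption (the function names, the RBF base kernel, and the anchor/coefficient parameterization are illustrative choices, not the paper's), shows how such a kernel could replace a fixed kernel inside KPCA:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """RBF base kernel matrix between the rows of X and Y."""
    sq = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Y**2, axis=1)[None, :]
          - 2.0 * X @ Y.T)
    return np.exp(-gamma * np.maximum(sq, 0.0))

def data_dependent_kernel(X, Y, anchors, alpha, gamma=1.0, gamma_q=1.0):
    """Conformal data-dependent kernel k(x, y) = q(x) q(y) k0(x, y),
    with q(x) = alpha[0] + sum_i alpha[i+1] * exp(-gamma_q * ||x - a_i||^2).
    `anchors` and `alpha` are the quantities a kernel self-optimization
    procedure would tune (here they are simply given)."""
    def q(Z):
        return alpha[0] + rbf_kernel(Z, anchors, gamma_q) @ alpha[1:]
    return q(X)[:, None] * q(Y)[None, :] * rbf_kernel(X, Y, gamma)

def kpca(K, n_components=2):
    """Kernel PCA projections from a precomputed kernel matrix K."""
    n = K.shape[0]
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one   # center in feature space
    vals, vecs = np.linalg.eigh(Kc)              # ascending eigenvalues
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    # scale eigenvectors so projected features have unit-norm components
    return Kc @ (vecs / np.sqrt(np.maximum(vals, 1e-12)))
```

Because q(x) only rescales the base kernel conformally, the resulting matrix stays symmetric and positive semidefinite, so it can be dropped into KPCA, KDA or KLPP unchanged; choosing the coefficients by maximizing a class-separability criterion is what reshapes the data distribution in the mapped space.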