PDA-SVM Hybrid: A Unified Model for Kernel-Based Supervised Classification

  • Authors:
  • S. Y. Kung; Man-Wai Mak

  • Affiliations:
  • Princeton University, Princeton, NJ 08544, USA; The Hong Kong Polytechnic University, Hung Hom, Hong Kong

  • Venue:
  • Journal of Signal Processing Systems
  • Year:
  • 2011

Abstract

For most practical supervised learning applications, the training datasets are often linearly nonseparable under the traditional Euclidean metric. To achieve more effective classification, a new and more flexible distance metric must be adopted. There exists a great variety of kernel-based classifiers, each with its own favorable domain of application. They are all based on a new distance metric induced by a kernel-based inner product. It is also known that a classifier's effectiveness depends strongly on the distribution of the training and testing data. The problem is that the right models for the observation data and the measurement noise are rarely known in advance. As a result, it is impossible to pinpoint an appropriate model for the best tradeoff between the classifier's training accuracy and error resilience. The objective of this paper is to develop a versatile classifier endowed with a broad array of parameters to cope with various kinds of real-world data. More specifically, a so-called PDA-SVM Hybrid is proposed as a unified model for kernel-based supervised classification. This paper examines the relationships among existing classifiers (such as KDA, PDA, and SVM) and explains why they are special cases of the unified model. It further explores the effects of key parameters on various aspects of error analysis. Finally, simulations were conducted on UCI and biological datasets, and the classifiers' performances are compared.
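
For reference, the kernel-induced distance metric mentioned in the abstract is not spelled out there; the standard construction, for a Mercer kernel K with implicit feature map phi, gives the squared distance between two samples x and y as

  d_K^2(x, y) = || phi(x) - phi(y) ||^2 = K(x, x) - 2 K(x, y) + K(y, y)

so replacing the Euclidean metric amounts to choosing the kernel K; the ordinary Euclidean distance is recovered with the linear kernel K(x, y) = x^T y.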