Building sparse representations and structure determination on LS-SVM substrates

  • Authors:
  • K. Pelckmans; J. A. K. Suykens; B. De Moor

  • Affiliations:
  • K.U. Leuven, ESAT-SCD/SISTA, Kasteelpark Arenberg 10, B-3001 Leuven (Heverlee), Belgium (all authors)

  • Venue:
  • Neurocomputing
  • Year:
  • 2005

Abstract

This paper proposes a new method to obtain sparseness and structure detection for a class of kernel machines related to least-squares support vector machines (LS-SVMs). The key idea is to adopt a hierarchical modeling strategy. The first level consists of an LS-SVM substrate based upon an LS-SVM formulation with an additive regularization trade-off. This regularization trade-off is determined at higher levels such that sparse representations and/or structure detection are obtained. Using the necessary and sufficient conditions for optimality given by the Karush-Kuhn-Tucker (KKT) conditions, the interaction between the different levels can be guided via a well-defined set of hyper-parameters. From a computational point of view, all levels can be fused into a single convex optimization problem. Furthermore, the principle is applied to optimize the validation performance of the resulting kernel machine. Sparse representations and structure detection are obtained, respectively, by using an L1 regularization scheme and a measure of maximal variation at the second level. A number of case studies indicate the usefulness of these approaches with respect to both the interpretability of the final model and its generalization performance.
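
For orientation, a minimal sketch of the formulations alluded to above, assuming the standard LS-SVM regression setup with kernel feature map phi and regularization constant gamma; the exact additive-regularization form used in the paper may differ, and the per-sample trade-off terms c_i below are an assumption based on the abstract's description:

  % Standard LS-SVM regression with a single regularization constant gamma
  % (background formulation; not copied from this paper):
  \min_{w,b,e}\ \frac{1}{2} w^{\top} w \;+\; \frac{\gamma}{2}\sum_{i=1}^{N} e_i^{2}
  \qquad \text{s.t.}\qquad y_i = w^{\top}\varphi(x_i) + b + e_i,\quad i = 1,\dots,N.

  % A plausible additive-regularization substrate (hypothetical form): the single
  % constant gamma is replaced by per-sample trade-off terms c_i fixed at a higher level:
  \min_{w,b,e}\ \frac{1}{2} w^{\top} w \;+\; \frac{1}{2}\sum_{i=1}^{N} \left(e_i - c_i\right)^{2}
  \qquad \text{s.t.}\qquad y_i = w^{\top}\varphi(x_i) + b + e_i,\quad i = 1,\dots,N.

Under this reading, the KKT conditions of the first-level problem are linear in the unknowns for a fixed choice of the trade-off terms, so a second level that selects those terms (e.g. through an L1 penalty to induce sparseness, or a measure of maximal variation for structure detection) can be merged with the first level into one convex program, which is the fusion claim made in the abstract.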