Comments on “Pruning Error Minimization in Least Squares Support Vector Machines”

  • Authors:
  • A. Kuh; P. De Wilde

  • Affiliations:
  • Univ. of Hawaii, Honolulu; -

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 2007

Abstract

In this letter, we comment on "Pruning Error Minimization in Least Squares Support Vector Machines" by B. J. de Kruif and T. J. A. de Vries. The original paper proposes a way of pruning training examples for least squares support vector machines (LS-SVMs) using no regularization (gamma = infinity). This causes a problem, as the derivation involves inverting a matrix that is often singular. We discuss a modification of this algorithm that prunes with regularization (gamma finite and nonzero) and is also computationally more efficient.
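The singularity issue described above can be illustrated with the standard LS-SVM dual linear system. The sketch below is not the authors' implementation; the function names, the RBF kernel choice, and the toy data are assumptions chosen to show why a finite, nonzero gamma keeps the system solvable while gamma = infinity (no regularization) can leave a singular kernel matrix:

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    # Gaussian (RBF) kernel matrix from pairwise squared distances.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma**2))

def lssvm_fit(X, y, gamma):
    # Solve the LS-SVM dual system
    #   [ 0   1^T         ] [ b     ]   [ 0 ]
    #   [ 1   K + I/gamma ] [ alpha ] = [ y ]
    # Finite, nonzero gamma makes K + I/gamma positive definite;
    # taking gamma -> infinity leaves K alone, which is often
    # singular (e.g., when two training inputs coincide).
    n = len(y)
    K = rbf_kernel(X)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]  # bias b, dual coefficients alpha

# Duplicated inputs make K exactly rank deficient, yet the
# regularized system still solves cleanly.
X = np.array([[0.0], [0.0], [1.0], [2.0]])  # first two rows identical
y = np.array([1.0, 1.0, -1.0, -1.0])
b, alpha = lssvm_fit(X, y, gamma=10.0)
```

On this toy data the unregularized kernel matrix K has two identical rows and cannot be inverted, while K + I/gamma is well conditioned for any finite gamma > 0.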