Combining DC Algorithms (DCAs) and Decomposition Techniques for the Training of Nonpositive-Semidefinite Kernels

  • Authors: F. B. Akoa
  • Affiliations: Dept. of Planning, AES Sonel, Douala
  • Venue: IEEE Transactions on Neural Networks
  • Year: 2008


Abstract

Decomposition methods are currently among the most popular approaches for training support vector machines (SVMs). When the kernel does not satisfy Mercer's condition, new techniques must be designed to handle the nonpositive-semidefinite kernels that result from this choice. In this work, we incorporate difference of convex functions (DC) optimization techniques into decomposition methods to tackle this difficulty. The new approach requires no modification of the problem, and we show that the use of a truncated DC algorithm (DCA) alone within the decomposition scheme produces a sufficient decrease of the objective function at each iteration. Thanks to this property, an asymptotic convergence proof of the new algorithm is obtained without any blockwise convexity assumption on the objective function. We also investigate a working set selection rule using second-order information for sequential minimal optimization (SMO)-type decomposition, in the spirit of DC optimization. Numerical results demonstrate the robustness and efficiency of the new methods compared with state-of-the-art software.
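For orientation, here is one standard way to set up the DC machinery the abstract refers to. The dual form and the spectral split of the kernel matrix are textbook choices, not necessarily the paper's exact construction:

```latex
% SVM dual with kernel matrix Q_{ij} = y_i y_j k(x_i, x_j):
\min_{\alpha}\ f(\alpha) \;=\; \tfrac{1}{2}\,\alpha^{\top} Q\,\alpha \;-\; e^{\top}\alpha
\qquad \text{s.t.}\quad 0 \le \alpha \le C,\quad y^{\top}\alpha = 0 .

% If k violates Mercer's condition, Q is indefinite. A spectral split
% Q = Q_{+} - Q_{-} with Q_{+}, Q_{-} positive semidefinite yields a DC decomposition
f(\alpha) \;=\; \underbrace{\tfrac{1}{2}\,\alpha^{\top} Q_{+}\,\alpha - e^{\top}\alpha}_{g(\alpha)\ \text{convex}}
\;-\; \underbrace{\tfrac{1}{2}\,\alpha^{\top} Q_{-}\,\alpha}_{h(\alpha)\ \text{convex}} ,

% and each DCA iteration linearizes the concave part -h at the current point:
\alpha^{k+1} \;\in\; \arg\min_{\alpha\ \text{feasible}}\ g(\alpha) \;-\; \big(Q_{-}\alpha^{k}\big)^{\top}\alpha .
```

The following is a minimal full-space DCA sketch under the same assumptions. Note that the paper applies a truncated DCA inside an SMO-type decomposition scheme on small working sets, whereas this sketch solves each convex subproblem over all variables; the spectral split and the SLSQP subproblem solver are purely illustrative:

```python
import numpy as np
from scipy.optimize import minimize

def dc_split(Q):
    """Split an indefinite matrix Q into Q_plus - Q_minus, both PSD."""
    lam, V = np.linalg.eigh(Q)
    Q_plus = (V * np.maximum(lam, 0.0)) @ V.T
    Q_minus = (V * np.maximum(-lam, 0.0)) @ V.T
    return Q_plus, Q_minus

def dca_svm_dual(Q, y, C, max_iter=50, tol=1e-6):
    """DCA for  min 1/2 a'Qa - e'a  s.t.  0 <= a <= C,  y'a = 0,
    where the indefinite Q is handled through the DC split above."""
    n = Q.shape[0]
    Qp, Qm = dc_split(Q)
    a = np.zeros(n)
    bounds = [(0.0, C)] * n
    cons = {"type": "eq", "fun": lambda x: y @ x, "jac": lambda x: y}
    for _ in range(max_iter):
        grad_h = Qm @ a  # gradient of the concave part's convex component at a^k

        # Convex subproblem: minimize g(x) - <grad_h, x> over the feasible set.
        def obj(x):
            return 0.5 * x @ Qp @ x - x.sum() - grad_h @ x

        def jac(x):
            return Qp @ x - 1.0 - grad_h

        res = minimize(obj, a, jac=jac, bounds=bounds,
                       constraints=[cons], method="SLSQP")
        if np.linalg.norm(res.x - a) < tol:
            a = res.x
            break
        a = res.x
    return a
```

Since each iterate minimizes a convex majorizer of f (the linearization of -h overestimates it), the objective value is nonincreasing along the iterates, which is the monotone-decrease property the convergence argument in the abstract builds on.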