Scaling up minimum enclosing ball with total soft margin for training on large datasets

  • Authors:
  • Wenjun Hu; Fu-Lai Chung; Shitong Wang; Wenhao Ying

  • Affiliations:
  • School of Digital Media, Jiangnan University, Wuxi, Jiangsu, China and School of Information and Engineering, Huzhou Teachers College, Huzhou, Zhejiang, China; Department of Computing, Hong Kong Polytechnic University, Hong Kong, China; School of Digital Media, Jiangnan University, Wuxi, Jiangsu, China and Department of Computing, Hong Kong Polytechnic University, Hong Kong, China; School of Digital Media, Jiangnan University, Wuxi, Jiangsu, China

  • Venue:
  • Neural Networks
  • Year:
  • 2012

Abstract

Recent research indicates that the standard Minimum Enclosing Ball (MEB) or the center-constrained MEB can be used for effective training on large datasets by employing the Core Vector Machine (CVM) or the generalized CVM (GCVM). However, for another extensively used variant, the MEB with total soft margin (T-MEB for brevity), the CVM or GCVM cannot be employed directly for fast training on large datasets, because the inequality constraint involved is violated. In this paper, a fast learning algorithm called FL-TMEB for scaling up T-MEB is presented. First, FL-TMEB slightly relaxes the constraints in T-MEB so that it becomes equivalent to a corresponding center-constrained MEB, which CVM can solve by finding the corresponding core set (CS). Then, with the help of the suboptimal-solution theorem for T-MEB, FL-TMEB constructs an extended core set (ECS) by adding the neighbors of selected CS samples to the CS. Finally, FL-TMEB takes the optimal weights on the ECS as the approximate solution of T-MEB. Experimental results on the UCI and USPS datasets demonstrate that the proposed method is effective.
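
For context, a total-soft-margin MEB of the kind discussed above is typically posed as a ball-fitting problem with per-sample slacks. The following display is a generic sketch (feature map φ, center c, radius R, slacks ξ_i, trade-off parameter C), not necessarily the paper's exact formulation:

    \min_{c,\,R,\,\xi}\; R^2 + C \sum_{i=1}^{n} \xi_i
    \quad \text{s.t.} \quad \|\varphi(x_i) - c\|^2 \le R^2 + \xi_i,
    \quad \xi_i \ge 0,\quad i = 1,\dots,n

Relaxing or rewriting such constraints is what allows the problem to be recast as a center-constrained MEB that CVM-style core-set solvers can handle.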
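
The three-stage procedure in the abstract can also be sketched in code. Below is a minimal illustration assuming a linear kernel and a simple Frank-Wolfe solver standing in for the CVM; the names meb_weights and fl_tmeb_sketch, the k-nearest-neighbor rule for building the ECS, and the parameters k and tol are all assumptions made for illustration, not the paper's actual algorithm or API.

    import numpy as np

    def meb_weights(X, n_iter=500):
        # Frank-Wolfe on the dual MEB problem over the rows of X:
        #   max_a  sum_i a_i * K_ii - a^T K a,  with a >= 0, sum(a) = 1.
        # The support of a approximates a core set (a stand-in for CVM).
        n = X.shape[0]
        K = X @ X.T                    # linear kernel for simplicity
        d = np.diag(K)
        a = np.full(n, 1.0 / n)
        for t in range(n_iter):
            grad = d - 2.0 * (K @ a)   # gradient of the dual objective
            i = int(np.argmax(grad))   # sample farthest from current center
            gamma = 2.0 / (t + 3.0)    # standard diminishing step size
            a = (1.0 - gamma) * a
            a[i] += gamma
        return a

    def fl_tmeb_sketch(X, k=5, tol=1e-4):
        # Step 1: solve the relaxed problem on all data; the support of
        # the dual weights plays the role of the core set (CS).
        a = meb_weights(X)
        cs = np.flatnonzero(a > tol)
        # Step 2: extend the CS to the ECS by adding each CS sample's k
        # nearest neighbors (this neighborhood rule is an assumption).
        dists = np.linalg.norm(X[cs, None, :] - X[None, :, :], axis=2)
        knn = np.argsort(dists, axis=1)[:, :k]
        ecs = np.unique(np.concatenate([cs, knn.ravel()]))
        # Step 3: re-optimize the weights restricted to the ECS and take
        # them as the approximate solution.
        a_ecs = meb_weights(X[ecs])
        center = a_ecs @ X[ecs]
        return ecs, a_ecs, center

    # Tiny usage example on synthetic data.
    X = np.random.default_rng(0).normal(size=(200, 5))
    ecs, w, c = fl_tmeb_sketch(X)
    print(ecs.size, c.shape)

The design mirrors the abstract's point: the expensive optimization touches every sample only once, and the final weights are computed on the small ECS, which is what keeps the cost manageable on large datasets.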