Core Vector Regression for very large regression problems

  • Authors:
  • Ivor W. Tsang, James T. Kwok, Kimo T. Lai

  • Affiliations:
  • The Hong Kong University of Science and Technology, Kowloon, Hong Kong (all authors)

  • Venue:
  • ICML '05: Proceedings of the 22nd International Conference on Machine Learning
  • Year:
  • 2005

Abstract

In this paper, we extend the recently proposed Core Vector Machine algorithm to the regression setting by generalizing the underlying minimum enclosing ball problem. The resulting Core Vector Regression (CVR) algorithm can be used with any linear or nonlinear kernel and obtains provably approximately optimal solutions. Its asymptotic time complexity is linear in the number of training patterns m, while its space complexity is independent of m. Experiments show that CVR achieves performance comparable to SVR, but is much faster and produces far fewer support vectors on very large data sets. It is also successfully applied to large 3D point sets in computer graphics for the modeling of implicit surfaces.
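To give a feel for the core-set idea the abstract refers to, below is a minimal Euclidean sketch of a (1+ε)-approximate minimum enclosing ball iteration in the style of Bădoiu and Clarkson. It is illustrative only: CVR itself operates in a kernel-induced feature space and maintains an explicit core set of training patterns, which this plain-NumPy sketch does not, and the function name `approx_meb` and its parameters are hypothetical choices for this example rather than the authors' code.

```python
import numpy as np

def approx_meb(points, eps=0.1):
    """Sketch of a (1+eps)-approximate minimum enclosing ball.

    Repeatedly moves the current centre toward the furthest point,
    the same "add the furthest point to the core set" intuition that
    the core-vector approach builds on.  Euclidean only; not CVR.
    """
    c = points[0].astype(float).copy()              # start at an arbitrary point
    for i in range(1, int(np.ceil(1.0 / eps**2)) + 1):
        d = np.linalg.norm(points - c, axis=1)      # distances to current centre
        far = points[np.argmax(d)]                  # furthest point = next "core vector"
        c += (far - c) / (i + 1)                    # shrinking step toward it
    radius = np.linalg.norm(points - c, axis=1).max()
    return c, radius

# Toy usage: the loop count depends only on eps, not on the number of points,
# mirroring the paper's claim that space is independent of m.
rng = np.random.default_rng(0)
X = rng.normal(size=(100_000, 3))
centre, r = approx_meb(X, eps=0.05)
```

The key design point mirrored here is that the number of iterations, and hence the size of the core set in the kernelized algorithm, depends only on the approximation parameter ε, which is what yields time linear in m and space independent of m.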