Block-quantized support vector ordinal regression

  • Authors:
  • Bin Zhao; Fei Wang; Changshui Zhang

  • Affiliations:
  • State Key Laboratory of Intelligent Technology and Systems, Tsinghua National Laboratory for Information Science and Technology, Department of Automation, Tsinghua University, Beijing, China (all authors)

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 2009

Abstract

Support vector ordinal regression (SVOR) is a recently proposed ordinal regression (OR) algorithm. Despite its theoretical and empirical success, the method has one major bottleneck: its high computational complexity. In this brief, we propose a practical algorithm with theoretical guarantees, block-quantized support vector ordinal regression (BQSVOR), in which we approximate the kernel matrix K with a matrix K̃ composed of k² constant blocks. We provide a detailed theoretical justification of the approximation accuracy of BQSVOR. Moreover, we prove that the OR problem with the block-quantized kernel matrix K̃ can be solved by first partitioning the training samples into k clusters with kernel k-means and then performing SVOR on the k cluster representatives. Hence, the algorithm leads to an optimization problem that scales only with the number of clusters instead of the data set size. Finally, experiments on several real-world data sets support this analysis and demonstrate that BQSVOR speeds up SVOR significantly with guaranteed accuracy.
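
The block-quantization step described in the abstract can be illustrated with a minimal sketch (not taken from the paper): it assumes an RBF kernel and uses scikit-learn's plain KMeans as a stand-in for the kernel k-means step; the function name block_quantized_kernel is hypothetical. The k×k kernel between cluster representatives is expanded into a full matrix of k² constant blocks, so downstream training only needs the k representatives.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics.pairwise import rbf_kernel

def block_quantized_kernel(X, k, gamma=1.0, random_state=0):
    """Approximate the RBF kernel matrix of X with a block-quantized matrix
    built from k cluster representatives (plain k-means is used here as a
    stand-in for the kernel k-means step described in the paper)."""
    km = KMeans(n_clusters=k, n_init=10, random_state=random_state).fit(X)
    labels = km.labels_              # cluster index of every training sample
    reps = km.cluster_centers_       # one representative per cluster
    K_rep = rbf_kernel(reps, reps, gamma=gamma)   # k x k kernel between representatives
    # Entry (i, j) of the approximation is the kernel value between the
    # representatives of the clusters containing samples i and j, so the
    # full matrix consists of k^2 constant blocks.
    K_tilde = K_rep[np.ix_(labels, labels)]
    return K_tilde, reps, labels

# Usage: K_tilde approximates rbf_kernel(X, X) with k^2 constant blocks,
# while any solver acting on it only needs the k representatives in `reps`.
X = np.random.RandomState(0).randn(200, 5)
K_tilde, reps, labels = block_quantized_kernel(X, k=10)
```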