TaylorBoost: First and second-order boosting algorithms with explicit margin control

  • Authors:
  • M. J. Saberian; H. Masnadi-Shirazi; N. Vasconcelos

  • Affiliations:
  • Dept. of Electr. & Comput. Eng., Univ. of California, San Diego, CA, USA (all authors)

  • Venue:
  • CVPR '11 Proceedings of the 2011 IEEE Conference on Computer Vision and Pattern Recognition
  • Year:
  • 2011


Abstract

A new family of boosting algorithms, denoted TaylorBoost, is proposed. It supports any combination of loss function and first- or second-order optimization, and includes classical algorithms such as AdaBoost, GradientBoost, and LogitBoost as special cases. Restricting TaylorBoost to the set of canonical losses makes it possible to obtain boosting algorithms with explicit margin control. A new large family of losses with this property, based on the set of cumulative distribution functions of zero-mean random variables, is then proposed. A novel loss function in this family, the Laplace loss, is finally derived. The combination of this loss and second-order TaylorBoost produces a boosting algorithm with explicit margin control.
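To make the first-order case concrete, the following is a minimal sketch of what one TaylorBoost-style boosting round looks like when the loss is expanded to first order in function space: the descent direction is the negative functional gradient of the margin loss, a weak learner is fit to it, and a step size is chosen along that direction. This is an illustration under assumptions, not the paper's algorithm: the plug-in loss is the standard logistic loss (the paper's Laplace loss and canonical-loss construction are not reproduced here), weak learners are simple regression stumps, the step size comes from a crude grid line search rather than any closed form, and the helper names (phi, dphi, fit_stump, taylorboost_first_order) are hypothetical. The second-order (Newton-like) variant, which would also use the second derivative of the loss, is not shown.

    import numpy as np

    def phi(v):
        # Logistic margin loss phi(v), v = y * f(x); a stand-in choice,
        # not the Laplace loss derived in the paper.
        return np.log1p(np.exp(-v))

    def dphi(v):
        # Derivative of phi with respect to the margin v.
        return -1.0 / (1.0 + np.exp(v))

    def fit_stump(X, r):
        # Least-squares regression stump fit to pseudo-residuals r.
        best = None
        for j in range(X.shape[1]):
            for t in np.unique(X[:, j]):
                left = X[:, j] <= t
                cl = r[left].mean() if left.any() else 0.0
                cr = r[~left].mean() if (~left).any() else 0.0
                err = np.sum((r - np.where(left, cl, cr)) ** 2)
                if best is None or err < best[0]:
                    best = (err, j, t, cl, cr)
        _, j, t, cl, cr = best
        return lambda Z: np.where(Z[:, j] <= t, cl, cr)

    def taylorboost_first_order(X, y, n_rounds=50):
        # y in {-1, +1}; returns the ensemble predictor f(x).
        f = np.zeros(len(y))
        learners = []
        for _ in range(n_rounds):
            # First-order Taylor step: the steepest-descent direction in
            # function space at example i is -y_i * phi'(y_i * f(x_i)).
            grad = -y * dphi(y * f)
            g = fit_stump(X, grad)
            gx = g(X)
            # Scalar line search along the weak learner's direction.
            alphas = np.linspace(0.01, 2.0, 100)
            losses = [phi(y * (f + a * gx)).sum() for a in alphas]
            a = alphas[int(np.argmin(losses))]
            f += a * gx
            learners.append((a, g))
        return lambda Z: sum(a * g(Z) for a, g in learners)

With the logistic loss plugged in, this reduces to a gradient-boosting scheme of the GradientBoost type; swapping in a different margin loss changes only phi and dphi, which is the sense in which the framework supports any combination of loss function and expansion order.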