Linearized smooth additive classifiers

  • Authors:
  • Subhransu Maji

  • Affiliations:
  • Toyota Technological Institute at Chicago, Chicago, IL

  • Venue:
  • ECCV'12 Proceedings of the 12th European Conference on Computer Vision - Volume Part I
  • Year:
  • 2012

Abstract

We consider a framework for learning additive classifiers based on regularized empirical risk minimization, where the regularization favors "smooth" functions. We present representations of classifiers for which the optimization problem can be solved efficiently. The first family of such classifiers is derived from a penalized spline formulation due to Eilers and Marx, modified to enable linearization. The second is a novel family of classifiers based on classes of orthogonal basis functions with orthogonal derivatives. Both families lead to explicit feature embeddings that can be used with off-the-shelf linear solvers such as LIBLINEAR to obtain additive classifiers. The proposed family of classifiers offers better trade-offs between training time, memory overhead, and classifier accuracy than the state of the art in additive classifier training.
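The explicit-embedding idea from the abstract can be sketched as follows: each input dimension is mapped to a small set of smooth basis functions, so a linear classifier on the concatenated embedding is an additive function of the original features. This is a minimal illustration only; the sine/cosine basis, the `additive_embed` helper, the toy data, and all parameter choices are assumptions for the example, not the paper's penalized-spline or orthogonal-basis construction. scikit-learn's `LinearSVC` is used here as a front end to LIBLINEAR, the solver the abstract mentions.

```python
# Sketch: additive classifier via an explicit per-dimension basis embedding
# plus an off-the-shelf linear solver. Illustrative stand-in for the paper's
# embeddings, not the actual construction.
import numpy as np
from sklearn.svm import LinearSVC


def additive_embed(X, n_basis=4):
    """Map each feature x_j (assumed scaled to [0, 1]) to smooth basis
    functions; a linear classifier on the concatenation is additive in X."""
    X = np.clip(X, 0.0, 1.0)
    feats = [X]  # keep the raw linear term
    for k in range(1, n_basis + 1):
        # Damping by 1/k shrinks high-frequency terms, favoring smoothness.
        feats.append(np.sin(np.pi * k * X) / k)
        feats.append(np.cos(np.pi * k * X) / k)
    return np.hstack(feats)


# Toy data whose true decision function is additive in the two inputs.
rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 2))
y = (np.sin(2 * np.pi * X[:, 0]) + X[:, 1] > 0.5).astype(int)

# LinearSVC wraps LIBLINEAR; training cost is linear in the embedded data.
clf = LinearSVC(C=1.0, max_iter=20000).fit(additive_embed(X), y)
acc = clf.score(additive_embed(X), y)
print(f"train accuracy: {acc:.2f}")
```

Because the embedding is explicit and low-dimensional (a handful of basis functions per input dimension), training scales like any linear model, which is the trade-off the abstract highlights against kernel-based additive classifiers.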