Entropy and margin maximization for structured output learning

  • Authors:
  • Patrick Pletscher, Cheng Soon Ong, Joachim M. Buhmann

  • Affiliations:
  • Department of Computer Science, ETH Zürich, Switzerland (all authors)

  • Venue:
  • ECML PKDD'10 Proceedings of the 2010 European conference on Machine learning and knowledge discovery in databases: Part III
  • Year:
  • 2010

Abstract

We consider the problem of training discriminative structured output predictors, such as conditional random fields (CRFs) and structured support vector machines (SSVMs). A generalized loss function is introduced, which jointly maximizes the entropy and the margin of the solution. The CRF and the SSVM emerge as special cases of our framework. The probabilistic interpretation of large-margin methods reveals insights about margin and slack rescaling. Furthermore, we derive the corresponding extensions for latent variable models, in which training operates on partially observed outputs. Experimental results for multiclass classification, linear-chain models, and multiple instance learning demonstrate that the generalized loss can improve the accuracy of the resulting classifiers.
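The abstract describes a loss that interpolates between the CRF log-loss and the SSVM hinge loss. A common way to realize such a family is a temperature-scaled, loss-augmented log-sum-exp (a "softmax-margin" style objective): at inverse temperature β = 1 with no margin term it reduces to the CRF negative log-likelihood, and as β → ∞ it approaches the margin-rescaled SSVM hinge loss. The sketch below is our illustrative multiclass reconstruction under these assumptions, not the paper's exact formulation; all names (`generalized_loss`, `beta`, `margin`) are ours.

```python
import numpy as np

def generalized_loss(scores, y_true, beta, margin=1.0):
    """Illustrative multiclass loss interpolating between log-loss and hinge loss.

    scores : array of model scores <w, phi(x, y)> for each label y (assumed given)
    y_true : index of the correct label
    beta   : inverse temperature; beta=1, margin=0 -> CRF log-loss,
             beta -> inf -> margin-rescaled SSVM hinge loss
    margin : task loss Delta(y, y_true); a 0/1 margin here for simplicity
    """
    # Margin rescaling: add Delta(y, y_true) to every wrong label's score.
    delta = np.full(len(scores), margin)
    delta[y_true] = 0.0
    # (1/beta) * log sum_y exp(beta * (score_y + Delta_y)) - score_{y_true},
    # computed with the max-shift trick for numerical stability.
    augmented = beta * (scores + delta)
    m = augmented.max()
    log_sum = m + np.log(np.sum(np.exp(augmented - m)))
    return log_sum / beta - scores[y_true]
```

For structured outputs the sum over labels would run over exponentially many configurations and be computed by dynamic programming (e.g. over a linear chain); the multiclass case above keeps the interpolation visible at a glance.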