Structured learning with constrained conditional models

  • Authors:
  • Ming-Wei Chang; Lev Ratinov; Dan Roth

  • Affiliations:
  • Computer Science Department, University of Illinois at Urbana-Champaign, Urbana, USA (all authors)

  • Venue:
  • Machine Learning
  • Year:
  • 2012


Abstract

Making complex decisions in real-world problems often involves assigning values to sets of interdependent variables, where an expressive dependency structure among these variables can influence, or even dictate, which assignments are possible. Commonly used models typically ignore expressive dependencies, since the traditional way of incorporating non-local dependencies is inefficient and hence leads to expensive training and inference.

The contribution of this paper is two-fold. First, the paper presents Constrained Conditional Models (CCMs), a framework that augments linear models with declarative constraints as a way to support decisions in an expressive output space while maintaining modularity and tractability of training. The paper develops, analyzes and compares novel algorithms for CCMs based on Hidden Markov Models and the Structured Perceptron. The proposed CCM framework is also compared to task-tailored models, such as semi-CRFs.

Second, we propose CoDL, a constraint-driven learning algorithm, which makes use of constraints to guide semi-supervised learning. We provide theoretical justification for CoDL along with empirical results which show the advantage of using declarative constraints in the context of semi-supervised training of probabilistic models.
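The core CCM idea of augmenting a linear model with declarative constraints can be illustrated with a small sketch. This is not the paper's implementation: the toy BIO-tagging task, the weight table, the penalty `rho`, and the brute-force search are all illustrative assumptions; in practice inference over the penalized objective would use Viterbi-style decoding or integer linear programming rather than enumeration.

```python
from itertools import product

# Illustrative sketch of CCM-style inference (toy task, not the paper's code):
# score a label sequence with a linear model, subtract a penalty rho for each
# violated declarative constraint, and take the argmax.

LABELS = ["B", "I", "O"]  # toy BIO chunking labels

def model_score(x, y, weights):
    # Stand-in for w . phi(x, y): sum of per-position (token, tag) weights.
    return sum(weights.get((tok, tag), 0.0) for tok, tag in zip(x, y))

def violations(y):
    # Declarative constraint: an "I" tag may not directly follow "O"
    # (an inside tag must continue a segment). Counts violations d(y, C).
    return sum(1 for a, b in zip(y, y[1:]) if a == "O" and b == "I")

def ccm_decode(x, weights, rho=10.0):
    # argmax_y  w . phi(x, y) - rho * d(y, C), by brute force over |LABELS|^n
    # (fine for a toy example; real systems use Viterbi or ILP inference).
    best_y, best_s = None, float("-inf")
    for y in product(LABELS, repeat=len(x)):
        s = model_score(x, y, weights) - rho * violations(y)
        if s > best_s:
            best_y, best_s = list(y), s
    return best_y

# Hypothetical learned weights: the local model slightly prefers tagging
# "New" as O, which would leave "York" with an illegal O -> I transition.
weights = {("in", "O"): 2.0, ("New", "B"): 2.0, ("New", "O"): 2.1,
           ("York", "I"): 1.5, ("York", "O"): 1.2}

print(ccm_decode(["in", "New", "York"], weights, rho=0.0))   # unconstrained
print(ccm_decode(["in", "New", "York"], weights, rho=10.0))  # with constraint
```

With the penalty switched off the locally best sequence contains an `O I` transition; turning the constraint on flips "New" to `B`, repairing the segment while leaving the rest of the model untouched. This modularity, keeping the learned model simple while enforcing global structure at inference time, is the point of the framework.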