Modeling context as statistical dependence

  • Authors:
  • Sriharsha Veeramachaneni; Prateek Sarkar; George Nagy

  • Affiliations:
  • SRA Division, ITC-IRST, Povo, TN, Italy; Palo Alto Research Center, Palo Alto, CA; ECSE Dept., Rensselaer Polytechnic Institute, Troy, NY

  • Venue:
  • CONTEXT'05 Proceedings of the 5th international conference on Modeling and Using Context
  • Year:
  • 2005

Abstract

Theories of context in logic enable reasoning and deduction in contexts represented as formal objects. Such theories are not readily applicable to systems that learn by induction from a set of examples. Probabilistic graphical models already provide the tools to exploit context represented as statistical dependences, thereby providing a unified methodology for incorporating context information in learning and inference. Drawing on a case study from optical character recognition, we present the various types of dependences that can occur in pattern classification problems and show how such dependences can be exploited to increase classification accuracy. Learning under different conditions requires differing amounts and kinds of samples, and different trade-offs between modeling error due to overly strict independence assumptions and estimation error of models that are too elaborate for the size of the available training set. With a series of examples based on frames of two patterns, we show how each kind of dependence can be represented using graphical models and present examples from other disciplines where the particular dependence frequently occurs.
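The idea of exploiting a statistical dependence between two patterns can be sketched with a toy example. This is not the paper's model: the two classes, the Gaussian class-conditional likelihoods, and the joint prior below are all hypothetical, chosen only to show how a pairwise dependence (e.g., a bigram prior over character pairs) can flip ambiguous context-free decisions.

```python
import math

# Hypothetical toy setup (not from the paper): two pattern positions, each
# one of two character classes, observed via a noisy scalar feature with
# class-conditional Gaussian likelihoods N(mean[c], 1).
classes = ('o', 'c')
means = {'o': 0.0, 'c': 1.0}

def likelihood(x, c):
    """Unnormalized Gaussian class-conditional likelihood p(x | c)."""
    return math.exp(-0.5 * (x - means[c]) ** 2)

# Statistical dependence between the two positions: a joint prior over
# class pairs that strongly favors ('c', 'o'), like a bigram language model.
joint_prior = {('o', 'o'): 0.1, ('o', 'c'): 0.1,
               ('c', 'o'): 0.7, ('c', 'c'): 0.1}

# Two ambiguous observations, each near the midpoint of the class means.
x1, x2 = 0.45, 0.55

# Context-free decisions: independent MAP per position, uniform priors.
indep = tuple(max(classes, key=lambda c: likelihood(x, c)) for x in (x1, x2))

# Context-aware decision: joint MAP over pairs, weighting the product of
# likelihoods by the pairwise dependence.
joint = max(joint_prior, key=lambda pair:
            joint_prior[pair] * likelihood(x1, pair[0]) * likelihood(x2, pair[1]))

print(indep)  # independent decisions: ('o', 'c')
print(joint)  # joint decision:        ('c', 'o')
```

With independence assumed, each ambiguous observation is decided on its own and the pair comes out ('o', 'c'); modeling the dependence lets the strong prior on ('c', 'o') override two weakly contradicting likelihoods, which is the kind of accuracy gain the abstract describes.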