Online multiclass learning by interclass hypothesis sharing

  • Authors:
  • Michael Fink; Shai Shalev-Shwartz; Yoram Singer; Shimon Ullman

  • Affiliations:
  • The Hebrew University, Jerusalem, Israel; The Hebrew University, Jerusalem, Israel; Google Inc., Mountain View, CA; Weizmann Institute, Rehovot, Israel

  • Venue:
  • ICML '06: Proceedings of the 23rd International Conference on Machine Learning
  • Year:
  • 2006

Abstract

We describe a general framework for online multiclass learning based on the notion of hypothesis sharing. In our framework, sets of classes are associated with hypotheses; all classes within a given set share the same hypothesis. This framework includes as special cases commonly used constructions for multiclass categorization, such as allocating a unique hypothesis for each class or allocating a single common hypothesis for all classes. We generalize the multiclass Perceptron to our framework and derive a unifying mistake bound analysis. Our construction naturally extends to settings where the number of classes is not known in advance but is instead revealed during the online learning process. We demonstrate the merits of our approach by comparing it to previous methods on both synthetic and natural datasets.
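
To make the hypothesis-sharing construction concrete, the following is a minimal sketch and not the authors' exact algorithm. It assumes (beyond what the abstract states) a class-dependent feature map phi(x, y), a user-supplied mapping from each class to the index of the hypothesis it shares, and a Perceptron-style additive update on prediction mistakes; the names SharedHypothesisPerceptron, phi, and class_to_hyp are hypothetical.

```python
# Sketch of online multiclass prediction with shared hypotheses.
# Assumptions: phi(x, y) returns a fixed-dimension feature vector for
# instance x under class y; class_to_hyp maps each class label to the
# index of the hypothesis (weight vector) that the class shares.
import numpy as np

class SharedHypothesisPerceptron:
    def __init__(self, dim, class_to_hyp):
        self.class_to_hyp = dict(class_to_hyp)
        n_hyps = len(set(self.class_to_hyp.values()))
        self.W = np.zeros((n_hyps, dim))  # one weight vector per hypothesis set

    def predict(self, phi, x, classes):
        # Score each class with the weight vector of the hypothesis it shares.
        scores = {y: self.W[self.class_to_hyp[y]] @ phi(x, y) for y in classes}
        return max(scores, key=scores.get)

    def update(self, phi, x, y_true, classes):
        y_hat = self.predict(phi, x, classes)
        if y_hat != y_true:  # Perceptron-style correction on a mistake
            self.W[self.class_to_hyp[y_true]] += phi(x, y_true)
            self.W[self.class_to_hyp[y_hat]] -= phi(x, y_hat)
        return y_hat
```

In this sketch, choosing class_to_hyp as the identity map recovers the usual one-hypothesis-per-class multiclass Perceptron, while mapping every class to a single index corresponds to the single-shared-hypothesis special case mentioned in the abstract; new classes revealed during the online process can be handled by extending class_to_hyp on the fly.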