Path Kernels and Multiplicative Updates

  • Authors:
  • Eiji Takimoto; Manfred K. Warmuth

  • Venue:
  • COLT '02 Proceedings of the 15th Annual Conference on Computational Learning Theory
  • Year:
  • 2002

Abstract

We consider a natural convolution kernel defined by a directed graph. Each edge contributes an input; the inputs along a path form a product, and these products are summed over all paths. We also place probabilities on the edges so that the outflow from each node is one. We then discuss multiplicative updates on these graphs, where the prediction is essentially a kernel computation and the update contributes a factor to each edge. After the update the total outflow out of each node is no longer one; however, clever algorithms re-normalize the weights on the paths so that the total outflow out of each node is one again. Finally, we discuss the use of regular expressions for speeding up the kernel and re-normalization computations. In particular, we rewrite the multiplicative algorithms that predict as well as the best pruning of a series-parallel graph in terms of efficient kernel computations.
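
Both the path kernel and the re-normalization described in the abstract can be computed by a single dynamic-programming pass over the graph. Below is a minimal illustrative sketch in Python, not taken from the paper; the function names (path_kernel, push_weights), the edge representation, and the toy graphs are assumptions chosen for the example.

```python
from functools import lru_cache

def path_kernel(edges, source, sink):
    """Sum, over all source-to-sink paths, the product of per-edge
    contributions x_e * x_e' (dynamic programming over a DAG).
    edges: dict mapping node -> list of (next_node, x_e, x_e_prime)."""
    @lru_cache(maxsize=None)
    def total(node):
        if node == sink:                  # the empty path contributes 1
            return 1.0
        return sum(x * xp * total(nxt) for nxt, x, xp in edges.get(node, ()))
    return total(source)

def push_weights(edges, source, sink):
    """Re-normalize edge weights so the outflow of every node sums to one
    while preserving the relative weights of the paths.
    edges: dict mapping node -> list of (next_node, weight)."""
    @lru_cache(maxsize=None)
    def z(node):                          # total weight of paths node -> sink
        if node == sink:
            return 1.0
        return sum(w * z(nxt) for nxt, w in edges.get(node, ()))
    return {u: [(v, w * z(v) / z(u)) for v, w in outs]
            for u, outs in edges.items()}

# Toy series-parallel graph with two parallel paths from "s" to "t".
kernel_edges = {
    "s": [("a", 2.0, 1.0), ("b", 0.5, 4.0)],
    "a": [("t", 3.0, 1.0)],
    "b": [("t", 1.0, 1.0)],
}
print(path_kernel(kernel_edges, "s", "t"))   # (2*1)*(3*1) + (0.5*4)*(1*1) = 8.0

weighted_edges = {
    "s": [("a", 0.6), ("b", 0.9)],
    "a": [("t", 2.0)],
    "b": [("t", 1.0)],
}
normalized = push_weights(weighted_edges, "s", "t")
print(normalized["s"])                       # outflow of "s" sums to one again
```

In the second example the raw outflow of "s" is 0.6 + 0.9 = 1.5, yet after pushing the path weights back through the graph the two outgoing edges of "s" carry roughly 0.571 and 0.429, which sum to one while keeping the paths in the same proportion.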