PAC-Bayes Bounds with Data Dependent Priors

  • Authors:
  • Emilio Parrado-Hernández;Amiran Ambroladze;John Shawe-Taylor;Shiliang Sun

  • Affiliations:
  • Department of Signal Processing and Communications, University Carlos III of Madrid, Leganés, Spain;Department of Mathematics and Computer Science, Tbilisi Free University, Tbilisi, Georgia;Department of Computer Science, University College London, London, UK;Department of Computer Science and Technology, East China Normal University, Shanghai, China

  • Venue:
  • The Journal of Machine Learning Research
  • Year:
  • 2012

Abstract

This paper presents the prior PAC-Bayes bound and explores its capabilities as a tool to provide tight predictions of the generalization of SVMs. The computation of the bound involves estimating a prior distribution over classifiers from the available data, and then incorporating this prior into the usual PAC-Bayes generalization bound. We explore two alternatives: learning the prior from a separate data set, or considering an expectation prior that does not require a separate data set. The prior PAC-Bayes bound motivates two SVM-like classification algorithms, the prior SVM and the ν-prior SVM, whose regularization terms push towards minimizing the prior PAC-Bayes bound. The experimental work illustrates that the new bounds can be significantly tighter than the original PAC-Bayes bound when applied to SVMs; among them, the combination of the prior PAC-Bayes bound and the prior SVM algorithm gives the tightest bound.
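
For reference, the "usual PAC-Bayes generalization bound" mentioned in the abstract is typically stated as follows; the notation below is a sketch based on standard formulations of the theorem and is not drawn from this page. For a prior P over classifiers fixed before seeing the data, with probability at least 1 - δ over an i.i.d. sample of size m, every posterior Q satisfies

  \mathrm{KL}_{+}\!\left(\hat{e}_Q \,\|\, e_Q\right) \;\le\; \frac{\mathrm{KL}(Q \,\|\, P) + \ln\frac{m+1}{\delta}}{m}

where \hat{e}_Q and e_Q denote the empirical and true error rates of the stochastic (Gibbs) classifier drawn from Q, and KL is the Kullback-Leibler divergence. The data-dependent prior idea described in the abstract replaces the fixed P with a prior estimated from part of the available data (or with an expectation prior), aiming to shrink the \mathrm{KL}(Q \,\|\, P) term and hence tighten the bound evaluated on the remaining examples.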