Importance-Weighted Cross-Validation for Covariate Shift

  • Authors:
  • Masashi Sugiyama (Department of Computer Science, Tokyo Institute of Technology, Tokyo, Japan); Benjamin Blankertz, Matthias Krauledat, Guido Dornhege, Klaus-Robert Müller (Fraunhofer FIRST.IDA, Berlin, Germany)

  • Venue:
  • DAGM'06: Proceedings of the 28th Conference on Pattern Recognition
  • Year:
  • 2006

Abstract

A common assumption in supervised learning is that the input points in the training set follow the same probability distribution as the input points used for testing. However, this assumption is violated, for example, when we extrapolate outside the training region. The situation where the training input points and test input points follow different distributions is called covariate shift. Under covariate shift, standard machine learning techniques such as empirical risk minimization or cross-validation do not work well because they are no longer unbiased. In this paper, we propose a new method called importance-weighted cross-validation, which remains unbiased even under covariate shift. The usefulness of the proposed method is successfully tested on toy data and furthermore demonstrated in a brain-computer interface, where strong non-stationarity effects can be seen between calibration and feedback sessions.
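The core idea in the abstract can be sketched in a few lines: ordinary k-fold cross-validation averages held-out losses, while importance-weighted cross-validation multiplies each held-out loss by the importance weight w(x) = p_test(x) / p_train(x) before averaging. The sketch below is a minimal illustration under the assumption that the importance weights are given (in practice they must be estimated from unlabeled test inputs); the function and variable names are placeholders, not the authors' implementation.

```python
import numpy as np

def iwcv_risk(X, y, weights, fit, loss, n_folds=5, seed=0):
    """Importance-weighted cross-validation (IWCV) risk estimate.

    Each held-out loss is multiplied by the importance weight
    w(x) = p_test(x) / p_train(x), which keeps the risk estimate
    unbiased under covariate shift (weights assumed known here).
    With all weights equal to 1 this reduces to ordinary k-fold CV.
    """
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    folds = np.array_split(idx, n_folds)
    fold_risks = []
    for k in range(n_folds):
        test = folds[k]
        train = np.concatenate([folds[j] for j in range(n_folds) if j != k])
        model = fit(X[train], y[train])
        losses = loss(model, X[test], y[test])          # per-point losses
        fold_risks.append(np.mean(weights[test] * losses))  # weighted mean
    return float(np.mean(fold_risks))

# --- toy usage: 1-D linear regression with least squares ---
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(100, 1))
y = X[:, 0] + 0.1 * rng.standard_normal(100)

# Hypothetical importance weights; all-ones recovers ordinary CV.
w = np.ones(100)

def fit(Xtr, ytr):
    # least-squares fit of y = a*x + b
    A = np.c_[Xtr, np.ones(len(Xtr))]
    coef, *_ = np.linalg.lstsq(A, ytr, rcond=None)
    return coef

def loss(coef, Xte, yte):
    # squared error per test point
    pred = np.c_[Xte, np.ones(len(Xte))] @ coef
    return (pred - yte) ** 2

risk = iwcv_risk(X, y, w, fit, loss, n_folds=5)
```

In a real covariate-shift setting the weights would upweight held-out points that are likely under the test distribution, so model selection by IWCV favors models that perform well where the test data actually lie.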