Feature selection via dependence maximization

  • Authors:
  • Le Song, Alex Smola, Arthur Gretton, Justin Bedo, and Karsten Borgwardt

  • Affiliations:
  • Le Song: Computational Science and Engineering, Georgia Institute of Technology, Atlanta, GA
  • Alex Smola: Yahoo! Research, Santa Clara, CA
  • Arthur Gretton: Gatsby Computational Neuroscience Unit, London, UK; Intelligent Systems Group, Max Planck Institutes, Tübingen, Germany
  • Justin Bedo: Statistical Machine Learning Program, National ICT Australia, Canberra, ACT, Australia; Australian National University, Canberra, ACT, Australia
  • Karsten Borgwardt: Machine Learning and Computational Biology Research Group, Max Planck Institutes, Tübingen, Germany

  • Venue:
  • The Journal of Machine Learning Research
  • Year:
  • 2012

Abstract

We introduce a framework for feature selection based on dependence maximization between the selected features and the labels of an estimation problem, using the Hilbert-Schmidt Independence Criterion. The key idea is that good features should be highly dependent on the labels. Our approach leads to a greedy procedure for feature selection. We show that a number of existing feature selectors are special cases of this framework. Experiments on both artificial and real-world data show that our feature selector works well in practice.
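The greedy procedure described above can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' exact algorithm: it uses the biased empirical HSIC estimator, tr(KHLH)/(m-1)^2, with a Gaussian RBF kernel, and greedily eliminates features backward by repeatedly dropping the feature whose removal keeps HSIC between the remaining features and the labels highest. The function names, the choice of kernel, and the fixed bandwidth `sigma` are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    """Gaussian RBF kernel matrix for the rows of X."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-np.maximum(d2, 0.0) / (2.0 * sigma ** 2))

def hsic(K, L):
    """Biased empirical HSIC: tr(K H L H) / (m - 1)^2, H the centering matrix."""
    m = K.shape[0]
    H = np.eye(m) - np.full((m, m), 1.0 / m)
    return np.trace(K @ H @ L @ H) / (m - 1) ** 2

def backward_select(X, y, num_keep, sigma=1.0):
    """Illustrative backward elimination: drop the feature whose removal
    least reduces the HSIC dependence between features and labels."""
    L = rbf_kernel(y.reshape(-1, 1), sigma)          # label kernel
    selected = list(range(X.shape[1]))
    while len(selected) > num_keep:
        scores = []
        for j in selected:
            rest = [f for f in selected if f != j]
            scores.append(hsic(rbf_kernel(X[:, rest], sigma), L))
        # the feature whose removal keeps HSIC highest is least informative
        selected.remove(selected[int(np.argmax(scores))])
    return selected
```

A forward variant is symmetric: start from the empty set and repeatedly add the feature that most increases HSIC. Both are greedy, so they trade optimality for a cost of O(d) kernel evaluations per elimination step.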