Multi-label dimensionality reduction via dependence maximization

  • Authors:
  • Yin Zhang, Zhi-Hua Zhou

  • Affiliations:
  • National Key Laboratory for Novel Software Technology, Nanjing University, Nanjing, China (both authors)

  • Venue:
  • AAAI'08: Proceedings of the 23rd National Conference on Artificial Intelligence - Volume 3
  • Year:
  • 2008

Abstract

Multi-label learning deals with data in which each instance is associated with multiple labels simultaneously. Like other machine learning and data mining tasks, multi-label learning suffers from the curse of dimensionality. Although dimensionality reduction has been studied for many years, multi-label dimensionality reduction remains almost untouched. In this paper, we propose a multi-label dimensionality reduction method, MDDM (Multi-label Dimensionality reduction via Dependence Maximization), which attempts to project the original data into a lower-dimensional feature space that maximizes the dependence between the original feature description and the associated class labels. Based on the Hilbert-Schmidt Independence Criterion (HSIC), we derive a closed-form solution which makes the dimensionality reduction process efficient. Experiments validate the performance of MDDM.
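To make the abstract's idea concrete, the following is a minimal, hedged sketch of a *linear* dependence-maximizing projection in the spirit the abstract describes: with a linear kernel on the label vectors, the empirical HSIC objective reduces to a symmetric eigenproblem, whose top eigenvectors give a closed-form projection. The function name `mddm_projection` and all implementation details are illustrative assumptions, not the authors' code.

```python
import numpy as np

def mddm_projection(X, Y, n_components):
    """Sketch of a linear dependence-maximizing projection (assumed form).

    X : (n_samples, n_features) feature matrix
    Y : (n_samples, n_labels) binary multi-label indicator matrix
    Returns the projected data and the projection matrix P.
    """
    n = X.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n   # centering matrix
    L = Y @ Y.T                           # linear kernel over label vectors
    # Empirical HSIC with a linear feature kernel leads to maximizing
    # tr(P^T X^T H L H X P); the optimum is given by the top eigenvectors
    # of the symmetric matrix M below -- hence a closed-form solution.
    M = X.T @ H @ L @ H @ X
    eigvals, eigvecs = np.linalg.eigh(M)  # ascending eigenvalues
    idx = np.argsort(eigvals)[::-1][:n_components]
    P = eigvecs[:, idx]                   # (n_features, n_components)
    return X @ P, P
```

Because `M` is symmetric, `eigh` is both stable and efficient, which is what makes the projection cheap to compute compared with iterative dependence-maximization schemes.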