Multi-label learning by exploiting label dependency

  • Authors:
  • Min-Ling Zhang; Kun Zhang

  • Affiliations:
  • Southeast University, Nanjing, China; Max Planck Institute for Biological Cybernetics, Tübingen, Germany

  • Venue:
  • Proceedings of the 16th ACM SIGKDD international conference on Knowledge discovery and data mining
  • Year:
  • 2010

Abstract

In multi-label learning, each training example is associated with a set of labels, and the task is to predict the proper label set for an unseen example. Because the number of possible label sets grows exponentially with the number of labels, learning from multi-label examples is challenging, and the key to success is to effectively exploit correlations between different labels to facilitate the learning process. In this paper, we propose to use a Bayesian network structure to efficiently encode the conditional dependencies among the labels as well as the feature set, with the feature set as the common parent of all labels. To make this practical, we give an approximate yet efficient procedure for finding such a network structure. With the help of this network, multi-label learning is decomposed into a series of single-label classification problems: a classifier is constructed for each label by incorporating its parent labels as additional features. The label sets of unseen examples are then predicted recursively, following the label ordering given by the network. Extensive experiments on a broad range of data sets validate the effectiveness of our approach against other well-established methods.
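The decomposition described in the abstract can be illustrated with a short sketch. The following Python snippet is not from the paper: it assumes the label DAG is already given (the paper's structure-learning step is not shown), and the logistic-regression base learner is an illustrative choice. It trains one binary classifier per label with that label's parents appended as extra features, and predicts the label set of a new example by visiting labels in a topological order of the DAG, so parent predictions are available before their children.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def topological_order(dag):
    """Order labels so every parent precedes its children.
    dag: dict mapping every label to its (possibly empty) list of parent labels."""
    order, visited = [], set()
    def visit(label):
        if label in visited:
            return
        visited.add(label)
        for parent in dag[label]:
            visit(parent)
        order.append(label)
    for label in dag:
        visit(label)
    return order

def train(X, Y, dag):
    """X: (n, d) feature matrix; Y: dict label -> (n,) binary indicator vector."""
    models = {}
    for label in topological_order(dag):
        parents = dag[label]
        # Append the ground-truth parent labels to the features during training.
        X_aug = np.hstack([X] + [Y[p].reshape(-1, 1) for p in parents]) if parents else X
        models[label] = LogisticRegression(max_iter=1000).fit(X_aug, Y[label])
    return models

def predict(x, models, dag):
    """Predict the label set of a single example x (shape (d,)), following the DAG order."""
    preds = {}
    for label in topological_order(dag):
        parents = dag[label]
        # Parent predictions exist already because parents come earlier in the order.
        x_aug = np.concatenate([x] + [[preds[p]] for p in parents]) if parents else x
        preds[label] = int(models[label].predict(x_aug.reshape(1, -1))[0])
    return {label for label, v in preds.items() if v == 1}
```

In the paper the network structure over the labels is learned from data; here it is simply passed in as `dag`, so the sketch only covers the classifier-chain-style training and recursive prediction steps.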