Derivations of normalized mutual information in binary classifications

  • Authors:
  • Yong Wang; Bao-Gang Hu

  • Affiliations:
  • Beijing Graduate School, Chinese Academy of Sciences, Beijing; Beijing Graduate School, Chinese Academy of Sciences, Beijing, and National Laboratory of Pattern Recognition, Institute of Automation, Chinese Academy of Sciences, Beijing

  • Venue:
  • FSKD'09 Proceedings of the 6th international conference on Fuzzy systems and knowledge discovery - Volume 1
  • Year:
  • 2009

Abstract

Although conventional performance indexes, such as accuracy, are commonly used in classifier selection and evaluation, information-based criteria, such as mutual information, are becoming popular in feature/model selection. In this work, we analyze the classifier learning model under the criterion of maximizing normalized mutual information (NI), which is novel and well defined within a compact range for classifier evaluation. We derive closed-form relations of normalized mutual information with respect to accuracy, precision, and recall in binary classifications. By exploring the relations among them, we reveal that NI is actually a set of nonlinear functions, each with a concordant power-exponent form, of the performance indexes. The relations can also be expressed with respect to precision and recall, or with respect to false alarm and hitting rate (recall).
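The quantity discussed above can be illustrated with a short sketch. The code below computes a normalized mutual information between binary target labels T and classifier outputs Y, using the common convention NI = I(T;Y)/H(T); the paper's exact normalization may differ, so treat this as an illustrative assumption rather than the authors' definition.

```python
import math

def entropy(dist):
    """Shannon entropy (in bits) of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

def normalized_mi(y_true, y_pred):
    """NI = I(T;Y) / H(T) for binary labels.

    Normalizing mutual information by the target entropy H(T) is one
    common convention (an assumption here); it confines NI to [0, 1],
    with 1 for a perfect classifier and 0 for an uninformative one.
    """
    n = len(y_true)
    # Empirical joint distribution over (t, y) in {0,1} x {0,1}.
    joint = {(t, y): 0.0 for t in (0, 1) for y in (0, 1)}
    for t, y in zip(y_true, y_pred):
        joint[(t, y)] += 1.0 / n
    # Marginals of the target T and the prediction Y.
    pt = [joint[(0, 0)] + joint[(0, 1)], joint[(1, 0)] + joint[(1, 1)]]
    py = [joint[(0, 0)] + joint[(1, 0)], joint[(0, 1)] + joint[(1, 1)]]
    # Mutual information I(T;Y) in bits.
    mi = sum(p * math.log2(p / (pt[t] * py[y]))
             for (t, y), p in joint.items() if p > 0)
    ht = entropy(pt)
    return mi / ht if ht > 0 else 0.0

# A perfect binary classifier attains NI = 1; a constant one attains NI = 0.
print(normalized_mi([0, 0, 1, 1], [0, 0, 1, 1]))  # 1.0
print(normalized_mi([0, 0, 1, 1], [0, 0, 0, 0]))  # 0.0
```

Because NI is computed from the same confusion-matrix counts that define accuracy, precision, and recall, a sketch like this can also be used to verify closed-form relations among these indexes numerically.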