Relative information maximization and its application to the extraction of explicit class structure in SOM

  • Authors:
  • Ryotaro Kamimura

  • Affiliations:
  • IT Education Center, Tokai University, 1117 Kitakaname, Hiratsuka, Kanagawa 259-1292, Japan

  • Venue:
  • Neurocomputing
  • Year:
  • 2012


Abstract

In this paper, we propose a new type of information-theoretic method for determining the appropriate quantity of information to be contained in neural networks. Although information-theoretic methods have been applied extensively to neural networks, they have been concerned mainly with information maximization and minimization. In the present paper, we point out the necessity of paying due attention to the content of the obtained information, that is, the quality of the information; we should explore more exactly what kinds of information should be obtained in learning. We apply this idea to information-theoretic competitive learning, in which mutual information between competitive units and input patterns is used to realize competitive processes. We maximize not simply the mutual information but the relative information, namely, the ratio of the mutual information between competitive units and input patterns to the total information in the network. By maximizing the relative information, we can produce total information that includes the maximum mutual information. We applied this method to two data sets from the machine learning database, namely, the glass data and the musk problem. The experimental results are summarized in three points. First, the relative information could be maximized, meaning that peak values of relative information were obtained for both data sets. Second, maximizing the relative information improved the quantization and topographic errors. Third, when the relative information was maximized, clearer class structures were obtained in terms of the U-matrix and conditional mutual information.
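The quantities named in the abstract can be sketched numerically. The following is a minimal illustration, not the paper's implementation: it computes the mutual information I(J;S) between competitive units J and input patterns S from a matrix of firing probabilities p(j|s), and forms a relative-information ratio. Since the abstract does not define "total information" precisely, the sketch assumes it is the marginal entropy H(J) of the competitive units, so the ratio I(J;S)/H(J) lies in [0, 1]; that normalization choice is an assumption.

```python
import numpy as np

def relative_information(p_j_given_s, p_s=None):
    """Sketch of a relative-information ratio for competitive learning.

    p_j_given_s : (S, J) array of firing probabilities p(j|s); each row
                  (one input pattern) sums to 1 over the J competitive units.
    p_s         : optional (S,) prior over input patterns; uniform if None.

    Returns (mutual_info, total_info, ratio).

    ASSUMPTION: "total information" is taken here as the marginal entropy
    H(J) of the competitive units; the paper's exact definition may differ.
    """
    S, J = p_j_given_s.shape
    if p_s is None:
        p_s = np.full(S, 1.0 / S)           # uniform over input patterns
    p_j = p_s @ p_j_given_s                 # marginal firing probabilities p(j)

    # Mutual information I(J;S) = sum_s p(s) sum_j p(j|s) log(p(j|s)/p(j)),
    # with the convention 0 log 0 = 0.
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = np.where(p_j_given_s > 0,
                         p_j_given_s * np.log(p_j_given_s / p_j), 0.0)
    mutual_info = float(p_s @ terms.sum(axis=1))

    # Assumed "total information": marginal entropy H(J).
    total_info = float(-np.sum(np.where(p_j > 0, p_j * np.log(p_j), 0.0)))

    ratio = mutual_info / total_info if total_info > 0 else 0.0
    return mutual_info, total_info, ratio
```

For example, if each of three patterns fires exactly one of three units (`np.eye(3)`), the mutual information equals H(J) = log 3 and the ratio is 1; if every pattern fires all units uniformly, the mutual information and the ratio are both 0.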