In this paper, we propose a new information-theoretic method that simplifies and unifies learning methods within a single framework. The method, called "supposed maximum information," produces humanly comprehensible internal representations by supposing that information is already maximized before learning. Learning proceeds in three stages. First, a competitive network is trained without any information on the input variables. Second, under the supposition of maximum information on the input variables, the importance of each variable is estimated by measuring the mutual information between the competitive units and the input patterns. Finally, the competitive network is retrained with the inputs weighted by the estimated importance of the variables. We apply the method to self-organizing maps rather than to pure competitive learning, because SOMs make it easy to demonstrate intuitively how well the method produces more explicit internal representations. On the well-known SPECT heart data from the machine learning database, the method produced more comprehensible class boundaries on the U-matrices than the conventional SOM did, while the quantization and topographic errors it produced were no larger than those of the conventional SOM.
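The three-stage procedure can be sketched roughly as follows. This is a minimal illustration under our own assumptions, not the authors' implementation: the competitive network is a plain winner-take-all layer rather than a SOM, the importance of each variable is taken as the mutual information between the unit assignments and a quantile-discretized copy of that variable, and all function names are ours.

```python
import numpy as np

def winners(X, W):
    # Stage-1 output: index of the closest weight vector (winning unit)
    # for each input pattern.
    d = ((X[:, None, :] - W[None, :, :]) ** 2).sum(-1)
    return d.argmin(1)

def train_competitive(X, n_units=2, lr=0.1, epochs=50, seed=0):
    # Simple online winner-take-all competitive learning (a stand-in for
    # the SOM used in the paper; no neighborhood function here).
    rng = np.random.default_rng(seed)
    W = X[rng.choice(len(X), n_units, replace=False)].copy()
    for _ in range(epochs):
        for x in X:
            j = ((W - x) ** 2).sum(1).argmin()
            W[j] += lr * (x - W[j])
    return W

def mutual_information(a, b):
    # Mutual information (in nats) between two discrete label arrays,
    # estimated from their empirical joint distribution.
    joint = np.zeros((a.max() + 1, b.max() + 1))
    for i, j in zip(a, b):
        joint[i, j] += 1
    joint /= joint.sum()
    pa, pb = joint.sum(1), joint.sum(0)
    nz = joint > 0
    return (joint[nz] * np.log(joint[nz] / (pa[:, None] * pb[None, :])[nz])).sum()

def variable_importance(X, W, n_bins=4):
    # Stage 2: estimate each variable's importance as the mutual
    # information between the winning units and that variable,
    # discretized into quantile bins. Normalized so the maximum is 1.
    c = winners(X, W)
    imp = []
    for k in range(X.shape[1]):
        edges = np.quantile(X[:, k], np.linspace(0, 1, n_bins + 1)[1:-1])
        v = np.digitize(X[:, k], edges)
        imp.append(mutual_information(c, v))
    imp = np.array(imp)
    return imp / imp.max()
```

Stage 3 would then retrain the network on the importance-weighted inputs, e.g. `train_competitive(X * variable_importance(X, W))`, so that informative variables dominate the distance computation.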