Unifying cost and information in information-theoretic competitive learning
Neural Networks, 2005 Special Issue: IJCNN 2005
In this paper, we introduce the new concepts of enhancement and relaxation to discover features in input patterns in competitive learning. We have previously introduced mutual information to realize competitive processes. However, because mutual information is an average over all input patterns and competitive units, it cannot capture feature extraction in detail. To examine in more detail how a network is organized, we introduce the enhancement and relaxation of competitive units through selected elements of the network. With this procedure, we can estimate in greater detail how those elements are organized. We applied the method to a simple artificial data set and the well-known Iris problem to show how well it extracts the main features of the input patterns. Experimental results showed that the method extracts these features more explicitly than conventional SOM techniques.
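The abstract notes that mutual information in competitive learning is an average over all input patterns and competitive units. A minimal sketch of that quantity, assuming the standard definition I = Σ_s p(s) Σ_j p(j|s) log( p(j|s) / p(j) ), where p(j|s) would in practice come from normalized unit activations (the probabilities below are hypothetical values, not from the paper):

```python
import numpy as np

def mutual_information(p_j_given_s, p_s=None):
    """Mutual information between input patterns s and competitive units j.

    p_j_given_s: (S, J) array of conditional probabilities p(j|s),
                 each row summing to one.
    p_s:         optional (S,) prior over patterns; uniform if omitted.
    """
    p_j_given_s = np.asarray(p_j_given_s, dtype=float)
    S, J = p_j_given_s.shape
    if p_s is None:
        p_s = np.full(S, 1.0 / S)          # uniform prior over patterns
    p_j = p_s @ p_j_given_s                # marginal p(j) = sum_s p(s) p(j|s)
    # Where p(j|s) = 0 the term contributes nothing; set ratio to 1 so log = 0.
    ratio = np.where(p_j_given_s > 0, p_j_given_s / p_j, 1.0)
    return float(np.sum(p_s[:, None] * p_j_given_s * np.log(ratio)))

# Two patterns, two units: full specialization vs. no specialization.
sharp = np.array([[1.0, 0.0], [0.0, 1.0]])   # each pattern owns one unit
flat  = np.array([[0.5, 0.5], [0.5, 0.5]])   # units respond identically
print(mutual_information(sharp))  # log 2, the maximum for two units
print(mutual_information(flat))   # 0.0
```

The example illustrates the abstract's point: the measure distinguishes a fully specialized network from an unorganized one, but since it averages over all patterns and units, two networks with the same value can still differ in which individual units detect which features.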