In this paper, we propose a new information-theoretic method that simplifies the computation of information and unifies several existing methods in a single framework. The new method, called ''supposed maximum information,'' produces humanly comprehensible representations in competitive learning by taking into account the importance of individual input units. In this learning method, the actual information of the input units is estimated by supposing that their information is maximal; the competitive network is then trained with this estimated information in the input units. We apply the method not to pure competitive learning but to self-organizing maps, because SOMs make it easy to demonstrate visually how well the new method produces more interpretable representations. We applied the method to three well-known data sets: the Kohonen animal data, the SPECT heart data, and the voting data from the machine learning database. On these data, our method produced more explicit class boundaries on the U-matrices than the conventional SOM did. In addition, for all the data sets, the quantization and topographic errors produced by our method were lower than those of the conventional SOM.
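The two evaluation measures used above, quantization error (mean distance from each sample to its best-matching unit) and topographic error (fraction of samples whose two closest units are not adjacent on the map grid), can be sketched as follows. This is an illustrative implementation with hypothetical toy data, not the paper's code or data.

```python
import numpy as np

def quantization_error(data, weights):
    """Mean Euclidean distance from each sample to its best-matching unit (BMU).

    weights: (n_units, dim) codebook, the map grid flattened row by row.
    """
    dists = np.linalg.norm(data[:, None, :] - weights[None, :, :], axis=2)
    return float(dists.min(axis=1).mean())

def topographic_error(data, weights, grid):
    """Fraction of samples whose two closest units are NOT neighbors on the grid.

    grid: (n_units, 2) integer map coordinates of each unit.
    """
    dists = np.linalg.norm(data[:, None, :] - weights[None, :, :], axis=2)
    best_two = np.argsort(dists, axis=1)[:, :2]  # indices of BMU and second BMU
    # Units count as neighbors when their Chebyshev grid distance is 1.
    sep = np.abs(grid[best_two[:, 0]] - grid[best_two[:, 1]]).max(axis=1)
    return float(np.mean(sep > 1))

# Toy 2x2 map on 2-D data (hypothetical values, for illustration only)
grid = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
weights = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
data = np.array([[0.1, 0.1], [0.9, 0.95]])
print(quantization_error(data, weights))        # small: samples sit near units
print(topographic_error(data, weights, grid))   # 0.0: second BMUs are adjacent
```

Lower values of both measures indicate that the map both fits the data closely and preserves its topology, which is how the comparison against the conventional SOM is made.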