In this paper, we propose a new information-theoretic method for determining the appropriate quantity of information to be contained in neural networks. Although information-theoretic methods have been applied extensively to neural networks, they have been concerned mainly with maximizing or minimizing information. Here we point out the need to pay due attention to the content of the obtained information, that is, to its quality: we should examine more precisely what kinds of information should be acquired during learning. We apply this idea to information-theoretic competitive learning, in which mutual information between competitive units and input patterns realizes competitive processes. Rather than simply maximizing this mutual information, we maximize the relative information, namely the ratio of the mutual information between competitive units and input patterns to the total information in the network. By maximizing the relative information, we can produce total information that includes the maximum mutual information. We applied this method to two data sets from the machine learning database, the glass data and the musk problem. The experimental results can be summarized in three points. First, the relative information could indeed be maximized: peak values of relative information were obtained for both data sets. Second, maximizing the relative information improved the quantization and topographic errors. Third, when the relative information was maximized, clearer class structures emerged in terms of the U-matrix and conditional mutual information.
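The ratio described above can be sketched numerically. In the sketch below, competitive activity is summarized by a matrix of conditional probabilities p(j|x) of unit j firing for input pattern x (patterns assumed equiprobable), and, as a working assumption not fixed by the abstract, "total information in the network" is taken to be the entropy H(j) of the unit marginals, making the relative information a normalized mutual information in [0, 1]:

```python
import numpy as np

def mutual_information(p_j_given_x):
    """Mutual information I(X; J) between input patterns and competitive units.

    p_j_given_x: (S, M) array; row s holds p(j | x_s) over M units.
    Patterns are assumed equiprobable: p(x_s) = 1/S.
    """
    S, M = p_j_given_x.shape
    p_x = np.full(S, 1.0 / S)
    p_j = p_x @ p_j_given_x                      # marginal firing rates p(j)
    with np.errstate(divide="ignore", invalid="ignore"):
        log_ratio = np.where(p_j_given_x > 0.0,
                             np.log(p_j_given_x / p_j), 0.0)
    I = float(np.sum(p_x[:, None] * p_j_given_x * log_ratio))
    return I, p_j

def relative_information(p_j_given_x):
    """One plausible reading of the relative information: I(X; J) / H(J)."""
    I, p_j = mutual_information(p_j_given_x)
    nz = p_j[p_j > 0.0]
    H_j = float(-np.sum(nz * np.log(nz)))        # assumed "total information"
    return I / H_j if H_j > 0.0 else 0.0

# A deterministic, balanced assignment of 4 patterns to 2 units reaches
# the maximum ratio of 1; a completely uniform assignment gives 0.
p_sharp = np.array([[1, 0], [1, 0], [0, 1], [0, 1]], dtype=float)
p_flat = np.full((4, 2), 0.5)
```

With this definition, maximizing the ratio rewards networks whose units carry information about the inputs relative to the total activity entropy, rather than rewarding raw mutual information alone; the paper's exact normalizer may differ.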