This work explores learning lattice-connected Gaussian mixture (LCGM) models by annealed Kullback-Leibler (KL) divergence minimization, yielding a hybrid of topological and statistical pattern analysis. The KL divergence serves as the general learning criterion for an LCGM model, which is composed of a lattice of multivariate Gaussian units. The planar lattice emulates the topological order of cortex-like neighboring relations, while the built-in parameters of the connected Gaussian units represent statistical features of unsupervised data. Learning an LCGM model involves the collateral optimization tasks of resolving mixture combinatorics and extracting geometric features from high-dimensional patterns. Under the assumption that the mixture combinatorics encoded by Potts variables obey a Boltzmann distribution, their joint probability is approximated by the product of the individual probabilities, and the quality of this approximation is measured by the KL divergence; minimizing that divergence under physical-like deterministic annealing faithfully optimizes the underlying mixture combinatorics and geometric features. Numerical simulations show that the proposed annealed KL divergence minimization is effective and reliable for solving the generalized traveling salesman problem (TSP), spot identification, self-organization, and the visualization and sorting of yeast gene expressions.
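As a rough illustration of the annealing idea the abstract describes, the sketch below applies deterministic-annealing EM to a plain Gaussian mixture: the softmax responsibilities at temperature T play the role of the Boltzmann distribution over assignments, and cooling T gradually hardens the soft assignments. This is a minimal sketch under stated assumptions, not the paper's method: the lattice/topological coupling and the Potts mean-field factorization of the full LCGM model are omitted, and the function name (annealed_gmm) and schedule parameters (T0, Tmin, cool) are illustrative.

```python
import numpy as np

def annealed_gmm(X, K, T0=5.0, Tmin=0.05, cool=0.9, iters=20, seed=0):
    """Deterministic-annealing EM for an isotropic Gaussian mixture.

    Responsibilities follow a Boltzmann distribution over squared
    distances at temperature T; lowering T hardens the soft
    assignments, emulating the annealing schedule the abstract
    alludes to (the lattice coupling of the LCGM model is omitted).
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    mu = X[rng.choice(n, K, replace=False)]  # initial centers
    T = T0
    while T > Tmin:
        for _ in range(iters):
            # E-step: Boltzmann (softmax) responsibilities at temperature T
            d2 = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(-1)  # (n, K)
            logits = -d2 / T
            logits -= logits.max(axis=1, keepdims=True)  # numerical stability
            r = np.exp(logits)
            r /= r.sum(axis=1, keepdims=True)
            # M-step: centers become responsibility-weighted means
            mu = (r.T @ X) / r.sum(axis=0)[:, None]
        T *= cool  # cool down
    return mu, r

# Toy usage: three well-separated blobs in the plane.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(c, 0.3, size=(100, 2))
               for c in ([0, 0], [3, 0], [0, 3])])
centers, resp = annealed_gmm(X, K=3)
print(centers)
```

At high temperature the responsibilities are nearly uniform and the centers coincide; as T decreases they split apart in stages, which is the phase-transition behavior that makes deterministic annealing robust to poor initialization.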