Capturing statistical regularities in complex, high-dimensional data is an important problem in machine learning and signal processing. Models such as principal component analysis (PCA) and independent component analysis (ICA) make few assumptions about the structure in the data and have good scaling properties, but they are limited to representing linear statistical regularities and assume that the distribution of the data is stationary. For many complex natural signals, the latent variables recovered by such models exhibit residual dependencies as well as nonstationary statistics. Here we present a hierarchical Bayesian model that is able to capture higher-order nonlinear structure and represent nonstationary data distributions. The model is a generalization of ICA in which the basis function coefficients are no longer assumed to be independent; instead, the dependencies in their magnitudes are captured by a set of density components. Each density component describes a common pattern of deviation from the marginal density of the pattern ensemble; in different combinations, they can describe nonstationary distributions. Adapting the model to image or audio data yields a nonlinear, distributed code for higher-order statistical regularities that reflects more abstract, invariant properties of the signal.
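As a rough illustration of the generative process described above, the sketch below samples data from a two-layer model of this general form: higher-order variables combine density components to set the (log-)scales of the basis coefficients, which are then mixed linearly as in ICA. All names (`A`, `B`, `v`), the dimensionalities, and the choice of a Laplacian coefficient distribution are illustrative assumptions, not the paper's exact parameterization.

```python
import numpy as np

rng = np.random.default_rng(0)

n_pixels, n_basis, n_density = 64, 64, 10  # assumed toy dimensions

# Linear basis, as in standard ICA: columns are basis functions.
A = rng.standard_normal((n_pixels, n_basis))
# Density components: each column is a shared pattern of deviation
# in the coefficients' (log-)magnitudes.
B = 0.5 * rng.standard_normal((n_basis, n_density))

def sample_patch():
    # Higher-order variables pick a combination of density components.
    v = rng.standard_normal(n_density)
    # Coefficient scales depend multiplicatively on that combination,
    # so the coefficients are dependent in magnitude and the overall
    # distribution shifts with v (nonstationarity).
    lam = np.exp(B @ v)
    # Given v, coefficients are conditionally independent; a Laplacian
    # prior with per-unit scale stands in for a sparse density here.
    s = rng.laplace(scale=lam)
    # Mix linearly to produce the observed data vector.
    return A @ s

patch = sample_patch()
print(patch.shape)  # (64,)
```

Because different draws of `v` reshape the coefficient scales, patches sampled this way come from a family of distributions rather than a single stationary one, which is the behavior plain ICA cannot express.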