Capturing dependencies in images in an unsupervised manner is important for many image-processing applications and for understanding the structure of natural image signals. Data-generative linear models such as principal component analysis (PCA) and independent component analysis (ICA) have been shown to capture low-level features such as oriented edges in images. However, these models capture only linear dependencies, so their modeling capability is limited. We propose a new method for capturing nonlinear dependencies in images of natural scenes. The method extends linear ICA and builds on a hierarchical representation: a lower-level linear ICA representation is followed by a mixture of Laplacian distributions that learns the nonlinear dependencies in an image. The model parameters are learned via the expectation-maximization (EM) algorithm, and the model accurately captures variance correlations and other higher-order structures in a simple and consistent manner. We visualize the learned variance-correlation structure and demonstrate applications to automatic image segmentation and image denoising.
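The second stage of the hierarchy described above can be sketched as an EM fit of a mixture of zero-mean Laplacian distributions over ICA coefficients. This is a minimal illustrative sketch, not the authors' implementation: it assumes the lower-level ICA coefficients `S` are already available (synthetic Laplacian samples with two variance regimes stand in for real coefficients), and it uses per-component, per-dimension scale parameters, whose closed-form M-step update is the responsibility-weighted mean absolute deviation.

```python
import numpy as np

def laplacian_mixture_em(S, K=2, n_iter=50, seed=0):
    """Fit a K-component mixture of zero-mean Laplacians to coefficients S of shape (n, d).

    Component k has density prod_j (1 / (2 b[k, j])) * exp(-|s_j| / b[k, j]).
    Returns mixing weights pi (K,), scales b (K, d), responsibilities r (n, K).
    """
    rng = np.random.default_rng(seed)
    n, d = S.shape
    A = np.abs(S)
    pi = np.full(K, 1.0 / K)
    # Randomly perturbed scale initialization so components can differentiate.
    b = A.mean(axis=0) * rng.uniform(0.5, 1.5, size=(K, d))
    for _ in range(n_iter):
        # E-step: log joint log pi_k + log p_k(s_n), computed stably in log space.
        log_p = (-A[:, None, :] / b - np.log(2.0 * b)).sum(axis=-1) + np.log(pi)
        log_p -= log_p.max(axis=1, keepdims=True)
        r = np.exp(log_p)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: weights are mean responsibilities; the Laplace scale MLE
        # is the responsibility-weighted mean absolute deviation.
        Nk = r.sum(axis=0)
        pi = Nk / n
        b = (r.T @ A) / Nk[:, None]
    return pi, b, r

# Synthetic stand-in for ICA coefficients: two regimes with different variances,
# mimicking the variance correlations the mixture is meant to pick up.
rng = np.random.default_rng(1)
S = np.vstack([rng.laplace(scale=0.3, size=(500, 4)),
               rng.laplace(scale=2.0, size=(500, 4))])
pi, b, r = laplacian_mixture_em(S, K=2)
```

Assigning each coefficient vector to its most responsible component (`r.argmax(axis=1)`) yields exactly the kind of per-patch labeling that the segmentation application relies on: regions with different local variance structure fall into different mixture components.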