Texture classification by modeling joint distributions of local patterns with Gaussian mixtures

  • Authors:
  • Henning Lategahn; Sebastian Gross; Thomas Stehle; Til Aach

  • Affiliations:
  • Henning Lategahn: Institute of Measurement and Control, Karlsruhe Institute of Technology, Karlsruhe, Germany, and Institute of Imaging and Computer Vision, RWTH Aachen University, Aachen, Germany
  • Sebastian Gross, Thomas Stehle, Til Aach: Institute of Imaging and Computer Vision, RWTH Aachen University, Aachen, Germany

  • Venue:
  • IEEE Transactions on Image Processing
  • Year:
  • 2010

Abstract

Texture classification generally requires the analysis of patterns in local pixel neighborhoods. Statistically, the underlying processes are comprehensively described by their joint probability density functions (jPDFs). Even for small neighborhoods, however, stable estimation of jPDFs by joint histograms (jHSTs) is often infeasible, since the number of entries in the jHST far exceeds the number of pixels in a typical texture region. Moreover, evaluating distance functions between jHSTs is often computationally prohibitive. In practice, the number of entries in a jHST is therefore reduced either by considering only two-pixel patterns, leading to 2D-jHSTs known as cooccurrence matrices, or by quantizing the gray levels in local patterns to only two levels, yielding local binary patterns (LBPs). Both approaches result in a loss of information. We introduce here a framework for supervised texture classification which reduces or avoids this information loss. Local texture neighborhoods are first filtered by a filter bank. Without further quantization, the jPDF of the filter responses is then described parametrically by Gaussian mixture models (GMMs). We show that the parameters of the GMMs can be reliably estimated from small image regions. Moreover, distances between the thus modeled jPDFs of different texture patterns can be computed efficiently in closed form from their model parameters. We furthermore extend this texture descriptor to achieve full invariance to rotation. We evaluate the framework for different filter banks on the Brodatz texture set. We first show that combining the LBP difference filters with the GMM-based density estimator outperforms the classical LBP approach and its codebook extensions. When these rather elementary difference filters are replaced by the wavelet frame transform (WFT), the performance of the framework on all 111 Brodatz textures exceeds that obtained more recently with the spin image and RIFT descriptors of Lazebnik et al.
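The pipeline sketched in the abstract — LBP-style difference filters, a GMM fit to the multivariate filter responses, and a distance between GMMs evaluated in closed form from the model parameters — can be illustrated roughly as follows. This is a minimal sketch, not the authors' implementation: it assumes scikit-learn's `GaussianMixture` as the density estimator and uses the squared L2 distance between mixture densities as a stand-in closed-form distance, exploiting the identity ∫ N(x; m₁, C₁) N(x; m₂, C₂) dx = N(m₁; m₂, C₁ + C₂); the paper's actual distance measure and filter banks (e.g., the WFT) may differ. The synthetic "patches" and all component counts are illustrative choices.

```python
import numpy as np
from scipy.stats import multivariate_normal
from sklearn.mixture import GaussianMixture

def difference_responses(img):
    """LBP-style difference filters: each of the 8 neighbors minus the
    center pixel, giving one 8-D filter-response vector per interior pixel."""
    H, W = img.shape
    center = img[1:H-1, 1:W-1]
    offsets = [(-1,-1), (-1,0), (-1,1), (0,-1), (0,1), (1,-1), (1,0), (1,1)]
    feats = np.stack([img[1+dy:H-1+dy, 1+dx:W-1+dx] - center
                      for dy, dx in offsets], axis=-1)
    return feats.reshape(-1, len(offsets))

def fit_gmm(features, n_components=3, seed=0):
    """Model the jPDF of the filter responses parametrically with a GMM;
    return its parameters (weights, means, full covariances)."""
    gmm = GaussianMixture(n_components=n_components,
                          covariance_type="full", random_state=seed)
    gmm.fit(features)
    return gmm.weights_, gmm.means_, gmm.covariances_

def gmm_overlap(p, q):
    """Closed-form integral of the product of two GMM densities, using
    the Gaussian product integral N(m1; m2, C1 + C2)."""
    (wp, mp, Cp), (wq, mq, Cq) = p, q
    total = 0.0
    for wa, ma, Ca in zip(wp, mp, Cp):
        for wb, mb, Cb in zip(wq, mq, Cq):
            total += wa * wb * multivariate_normal.pdf(ma, mean=mb, cov=Ca + Cb)
    return total

def l2_distance(p, q):
    """Squared L2 distance between two GMM densities, computed in closed
    form from the parameters alone:  int (p - q)^2 = pp - 2 pq + qq."""
    return gmm_overlap(p, p) - 2.0 * gmm_overlap(p, q) + gmm_overlap(q, q)

# Toy usage on two synthetic "texture" patches (stand-ins for Brodatz regions).
rng = np.random.default_rng(0)
patch_a = rng.normal(size=(32, 32))                      # white-noise texture
patch_b = np.cumsum(rng.normal(size=(32, 32)), axis=1)   # horizontally correlated

gmm_a = fit_gmm(difference_responses(patch_a))
gmm_b = fit_gmm(difference_responses(patch_b))

d_ab = l2_distance(gmm_a, gmm_b)   # distance between different textures
d_aa = l2_distance(gmm_a, gmm_a)   # distance of a texture model to itself
```

Note that, unlike comparing joint histograms, the distance here never touches the pixel data again: it is a double sum over mixture components, so its cost depends only on the (small) number of components, which is what makes the closed-form evaluation efficient.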