We have developed a novel method for deriving scale information from quasi-stationary images, based on a rotation-guided multi-scale analysis of features derived from Gray-Level Co-occurrence Matrices (GLCM). Unlike other methods for multi-scale texture characterization, ours does not require rotation-invariant textural features; instead, it uses orientation information derived from the image to constrain the algorithm. The method computes GLCM textural features on a "stencil" that follows the local orientation field, and compares features obtained from a sliding window scanning the whole image with those of a user-selected reference pattern, using a similarity measure between the two feature sets. By applying different affine transforms to the stencil used for sampling the reference pattern, we can measure the similarity between regions of the image and dilated versions of the reference pattern, and hence perform a multi-resolution analysis of the image. For a given region of the image, the method finds the most likely scale; it can therefore estimate the stationarity of the image in terms of scale, which is an important input to multipoint geostatistics (MPGS). We tested the method on the Brodatz textures database.
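The core loop described above — GLCM features from a window, compared against features of dilated samplings of a reference pattern — can be sketched in a few lines of NumPy. This is a minimal, axis-aligned simplification: the function names (`glcm_features`, `best_scale`, `stripes`), the three Haralick-style features chosen (contrast, homogeneity, energy), and the use of stride-`s` subsampling to emulate dilating the sampling stencil are all illustrative assumptions; the paper's actual method additionally rotates the stencil to follow the local orientation field.

```python
import numpy as np

def glcm_features(patch, levels=8, offset=(1, 0)):
    """Quantise a grey-level patch, build a normalised GLCM for a single
    pixel offset, and return a small feature vector
    (contrast, homogeneity, energy)."""
    q = (patch.astype(float) / 256.0 * levels).astype(int)
    dy, dx = offset
    glcm = np.zeros((levels, levels))
    h, w = q.shape
    for y in range(max(0, -dy), min(h, h - dy)):
        for x in range(max(0, -dx), min(w, w - dx)):
            glcm[q[y, x], q[y + dy, x + dx]] += 1
    glcm /= glcm.sum()
    i, j = np.indices(glcm.shape)
    contrast = (glcm * (i - j) ** 2).sum()
    homogeneity = (glcm / (1.0 + np.abs(i - j))).sum()
    energy = (glcm ** 2).sum()
    return np.array([contrast, homogeneity, energy])

def best_scale(image, ref, top_left, win=32, scales=(1, 2, 3)):
    """Return the dilation factor of the reference pattern whose GLCM
    features are closest (Euclidean distance) to those of the image
    window at `top_left`.  Dilating the sampling stencil by s is
    emulated here by subsampling the reference with stride s."""
    y, x = top_left
    feats = glcm_features(image[y:y + win, x:x + win])
    dists = [np.linalg.norm(feats - glcm_features(ref[::s, ::s][:win, :win]))
             for s in scales]
    return scales[int(np.argmin(dists))]

def stripes(h, w, half_period):
    """Synthetic horizontal-stripe texture (stand-in for a Brodatz patch)."""
    row = np.repeat(np.array([0, 255]), half_period)
    return np.tile(row, h // (2 * half_period) + 1)[:h, None] * np.ones((1, w), int)
```

A small usage example: when the image contains the same texture as the reference but at half the period, the best match is found once the reference's sampling stencil is dilated by a factor of two.

```python
image = stripes(64, 64, 1)   # fine stripes, period 2
ref   = stripes(96, 96, 2)   # same texture at twice the scale
print(best_scale(image, ref, (0, 0)))  # -> 2
```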