We present an approach to multiscale image analysis. It hinges on an operative definition of texture involving a "small region", over which some (unknown) statistic is aggregated, and a "large region" within which that statistic is stationary. At each point, multiple small and large regions coexist at multiple scales: image structures are pooled by the scaling and quantization process to form "textures", and transitions between textures in turn define new "structures." We present a technique to learn and agglomerate sparse bases at multiple scales. To do so efficiently, we propose an analysis of cluster statistics after a clustering step is performed, as well as a new clustering method with linear-time performance. In both cases, we can infer all the "small" and "large" regions at multiple scales in one shot.
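To make the linear-time clustering claim concrete, here is a minimal sketch of one generic way to cluster in linear time: quantize feature vectors onto a grid, then flood-fill over occupied cells so that adjacent occupied cells share a cluster. This is an illustration under assumed design choices (a fixed `cell_size` resolution and 8-connectivity between cells), not the specific method proposed in the paper.

```python
from collections import defaultdict
from itertools import product

def grid_cluster(points, cell_size):
    """Cluster points by grid binning: O(n) in the number of points
    for fixed dimension. Hypothetical illustration, not the paper's method."""
    # One O(n) pass: quantize each point to its grid cell.
    cells = defaultdict(list)
    for i, p in enumerate(points):
        key = tuple(int(c // cell_size) for c in p)
        cells[key].append(i)

    dim = len(points[0])
    # All neighbouring cell offsets (8-connectivity in 2-D, 3^d - 1 in general).
    offsets = [o for o in product((-1, 0, 1), repeat=dim) if any(o)]

    # Flood-fill over occupied cells: adjacent occupied cells merge.
    labels, cluster_id = {}, 0
    for key in cells:
        if key in labels:
            continue
        labels[key] = cluster_id
        stack = [key]
        while stack:
            k = stack.pop()
            for off in offsets:
                nb = tuple(a + b for a, b in zip(k, off))
                if nb in cells and nb not in labels:
                    labels[nb] = cluster_id
                    stack.append(nb)
        cluster_id += 1

    # Map cell labels back to per-point labels.
    return [labels[tuple(int(c // cell_size) for c in p)] for p in points]
```

Because the cell size plays the role of a scale parameter, running this at several resolutions yields a hierarchy of groupings, loosely analogous to reading off "small" and "large" regions at multiple scales.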