By incorporating the local statistics of an image, a semi-causal non-stationary autoregressive random field can be fitted to a non-stationary image for segmentation. Because this non-stationary random field describes image texture better than a stationary one, it yields better segmentations. In addition to the low-order dependence among pixels captured by this texture random field, the paper introduces high-order dependence as a new classification feature for recognizing real objects. The entropy rate, which characterizes this high-order dependence, can also be estimated from the random field model. The proposed technique is applied to extract urban areas from a Landsat image.
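To illustrate the flavor of this approach (not the paper's exact algorithm), the sketch below fits a causal 2-D autoregressive model in each image window and converts the prediction-residual variance into an entropy-rate texture feature via the Gaussian formula h = ½ log(2πe·σ²). The window size, the three-neighbor causal support, and the least-squares fit are all illustrative assumptions.

```python
# Hedged sketch: per-window entropy-rate feature from a causal AR fit.
# Assumptions (not from the paper): non-overlapping windows, AR support of
# {left, up, up-left} neighbors, Gaussian residuals.
import numpy as np

def ar_entropy_rate(img, win=16):
    """Return one entropy-rate feature per non-overlapping win x win window.

    Each window is modeled as x[i,j] ~ a1*x[i,j-1] + a2*x[i-1,j] + a3*x[i-1,j-1];
    the residual variance sigma^2 gives h = 0.5 * log(2*pi*e*sigma^2).
    """
    H, W = img.shape
    feats = np.zeros((H // win, W // win))
    for bi in range(H // win):
        for bj in range(W // win):
            blk = img[bi*win:(bi+1)*win, bj*win:(bj+1)*win].astype(float)
            y = blk[1:, 1:].ravel()                    # pixels with full causal support
            X = np.stack([blk[1:, :-1].ravel(),        # left neighbor
                          blk[:-1, 1:].ravel(),        # up neighbor
                          blk[:-1, :-1].ravel()],      # up-left neighbor
                         axis=1)
            coef, *_ = np.linalg.lstsq(X, y, rcond=None)
            resid = y - X @ coef
            sigma2 = max(resid.var(), 1e-12)           # guard against log(0)
            feats[bi, bj] = 0.5 * np.log(2 * np.pi * np.e * sigma2)
    return feats

# Toy usage: a strongly correlated texture should score a lower entropy rate
# than near-white noise, so thresholding the feature separates the two regions.
rng = np.random.default_rng(0)
smooth = np.cumsum(rng.normal(size=(32, 32)), axis=1)  # predictable rows
rough = rng.normal(size=(32, 32)) * 10                 # unpredictable noise
img = np.hstack([smooth, rough])
f = ar_entropy_rate(img, win=32)
```

Thresholding `f` then gives a coarse window-level segmentation; the correlated half of the toy image yields a distinctly lower feature value than the noisy half.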