We propose a novel hierarchical latent topic model based on sparse coding. Unlike other topic models applied in computer vision, the words in our model are continuous rather than discrete: they are generated by sparse coding and represented as n-dimensional vectors in R^n. Because only a small subset of each word's components is active in sparse coding, we assume the probability distribution over these continuous words is Laplace, with the parameters of the Laplace distribution depending on the topics, which are the latent variables of the model. The relationships among word, topic, document, and corpus mirror those in Latent Dirichlet Allocation (LDA), so the model generalizes traditional LDA by introducing the concept of continuous words. We estimate the model parameters with an EM algorithm and apply the proposed method to significant computer vision problems such as natural scene categorization and object classification. The experimental results show that this is a promising direction for generalizing topic models.
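The generative process implied by the abstract (per-document topic proportions, a topic drawn per word, then a continuous word vector with topic-dependent Laplace scales) can be sketched as follows. This is a minimal illustration, not the paper's exact specification: the symmetric Dirichlet prior over topic proportions, the zero-mean Laplace components, and all parameter names here are assumptions for clarity.

```python
import math
import random

def sample_dirichlet(alpha):
    """Draw topic proportions via normalized Gamma draws (assumed prior)."""
    g = [random.gammavariate(a, 1.0) for a in alpha]
    s = sum(g)
    return [x / s for x in g]

def sample_laplace(mu, b):
    """Inverse-CDF draw from a Laplace(mu, b) distribution."""
    u = random.uniform(-0.5, 0.5)
    return mu - b * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def generate_document(n_words, topic_scales, alpha):
    """Generate one document of continuous 'words' (vectors in R^n).

    topic_scales[k][i] is the Laplace scale for component i under topic k;
    small scales concentrate mass near zero, mimicking sparse codes.
    """
    theta = sample_dirichlet(alpha)  # per-document topic mixture
    words = []
    for _ in range(n_words):
        # Draw a topic index for this word from the document's mixture,
        # then draw each component of the word from that topic's Laplace.
        z = random.choices(range(len(topic_scales)), weights=theta)[0]
        words.append([sample_laplace(0.0, b) for b in topic_scales[z]])
    return words
```

A corpus would then be a list of such documents, e.g. `[generate_document(50, scales, [1.0] * K) for _ in range(M)]`; EM would fit the topic-specific scales and mixing proportions from observed sparse codes rather than sampling them.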