We study the problem of multimodal dimensionality reduction assuming that data samples can be missing at training time, and not all data modalities may be present at application time. Maximum covariance analysis, as a generalization of PCA, has many desirable properties, but its application to practical problems is limited by its need for perfectly paired data. We overcome this limitation by a latent variable approach that allows working with weakly paired data and is still able to efficiently process large datasets using standard numerical routines. The resulting weakly paired maximum covariance analysis often finds better representations than alternative methods, as we show in two exemplary tasks: texture discrimination and transfer learning.
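To make the starting point concrete: classical maximum covariance analysis with perfectly paired data projects two modalities onto directions of maximal cross-covariance, computed from an SVD of the cross-covariance matrix. The sketch below illustrates only this classical, fully paired case with NumPy; the weakly paired extension the abstract describes (handling missing samples via latent pairing variables) is not shown, and all names here are illustrative rather than taken from the paper.

```python
import numpy as np

def maximum_covariance_analysis(X, Y, n_components=2):
    """Classical MCA for perfectly paired modalities X and Y:
    an SVD of the empirical cross-covariance matrix yields
    projection directions of maximal cross-modal covariance."""
    Xc = X - X.mean(axis=0)                    # center each modality
    Yc = Y - Y.mean(axis=0)
    C = Xc.T @ Yc / (X.shape[0] - 1)           # cross-covariance, d_x x d_y
    U, s, Vt = np.linalg.svd(C, full_matrices=False)
    Wx = U[:, :n_components]                   # projection for modality X
    Wy = Vt[:n_components].T                   # projection for modality Y
    return Xc @ Wx, Yc @ Wy

# Toy data: two modalities driven by a shared 2-d latent signal.
rng = np.random.default_rng(0)
Z = rng.normal(size=(100, 2))
X = Z @ rng.normal(size=(2, 5)) + 0.1 * rng.normal(size=(100, 5))
Y = Z @ rng.normal(size=(2, 4)) + 0.1 * rng.normal(size=(100, 4))
Px, Py = maximum_covariance_analysis(X, Y)
print(Px.shape, Py.shape)  # (100, 2) (100, 2)
```

Because the SVD operates on the d_x-by-d_y cross-covariance matrix rather than on the samples themselves, this formulation scales to large sample counts with standard numerical routines; it is exactly the requirement of one-to-one paired rows that the weakly paired variant relaxes.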