Canonical correlation analysis (CCA) is a popular and powerful dimensionality reduction method for analyzing paired multi-view data. However, when facing the semi-paired and semi-supervised multi-view data that are common in real-world problems, CCA usually performs poorly, because it requires the data to be paired across views and is unsupervised in nature. Several extensions of CCA have recently been proposed, but they either handle only the semi-paired scenario, by exploiting structural information within each view, or only the semi-supervised scenario, by incorporating discriminant information. In this paper, we present a general dimensionality reduction framework for semi-paired and semi-supervised multi-view data that naturally generalizes existing related work through the use of different kinds of prior information. Based on this framework, we develop a novel dimensionality reduction method, termed semi-paired and semi-supervised generalized correlation analysis (S^2GCA). S^2GCA exploits a small amount of paired data to perform CCA and, at the same time, uses both the global structural information captured from the unlabeled data and the local discriminative information captured from the limited labeled data to compensate for the limited pairing. Consequently, S^2GCA can find directions that achieve not only maximal correlation between the paired data but also maximal separability of the labeled data. Experimental results on an artificial dataset and four real-world datasets demonstrate its effectiveness compared with existing related dimensionality reduction methods.
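As background for the CCA component the abstract builds on, the following is a minimal sketch of classical (fully paired, unsupervised) CCA via its generalized eigenvalue formulation. This is not the paper's S^2GCA; the function name `cca`, the ridge term `reg`, and the test data are illustrative assumptions.

```python
import numpy as np

def cca(X, Y, n_components=1, reg=1e-6):
    """Classical CCA for paired views X (n, dx) and Y (n, dy).

    Solves Cxx^{-1} Cxy Cyy^{-1} Cyx wx = rho^2 wx for the leading
    directions; `reg` is a small ridge for numerical stability.
    Returns (Wx, Wy, rho): projection directions for each view and
    the canonical correlations.
    """
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    n = X.shape[0]
    # Regularized covariance and cross-covariance matrices.
    Cxx = Xc.T @ Xc / n + reg * np.eye(X.shape[1])
    Cyy = Yc.T @ Yc / n + reg * np.eye(Y.shape[1])
    Cxy = Xc.T @ Yc / n
    # Eigenproblem in the x-view; eigenvalues are squared correlations.
    M = np.linalg.solve(Cxx, Cxy) @ np.linalg.solve(Cyy, Cxy.T)
    vals, vecs = np.linalg.eig(M)
    order = np.argsort(-vals.real)[:n_components]
    rho = np.sqrt(np.clip(vals.real[order], 0.0, 1.0))
    Wx = vecs.real[:, order]
    # Corresponding y-view directions (defined up to scale).
    Wy = np.linalg.solve(Cyy, Cxy.T) @ Wx
    return Wx, Wy, rho

# Two views sharing one latent signal: the top canonical
# correlation should be close to 1.
rng = np.random.default_rng(0)
z = rng.normal(size=(500, 1))
X = np.hstack([z + 0.05 * rng.normal(size=(500, 1)),
               rng.normal(size=(500, 1))])
Y = np.hstack([z + 0.05 * rng.normal(size=(500, 1)),
               rng.normal(size=(500, 1))])
Wx, Wy, rho = cca(X, Y)
```

S^2GCA, as described above, augments this paired-correlation objective with graph-based structural terms from unlabeled data and discriminative terms from labeled data, so that the method degrades gracefully when only a fraction of the samples are paired.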