Canonical correlation analysis (CCA) is a widely used technique for analyzing two datasets (two views of the same objects). However, CCA requires that the samples of the two views be fully paired. In practice, we are often confronted with the semi-paired scenario, where only a limited number of paired samples is available while unpaired samples are plentiful. In such a scenario, CCA is prone to overfitting and thus performs poorly, since by definition it can exploit only the paired samples. To overcome this shortcoming, several semi-paired variants of CCA have been proposed. In these methods, however, unpaired samples are used only in a single-view manner, capturing each view's individual structure to regularize CCA. Intuitively, exploiting unpaired samples in a two-view manner should be more natural and more attractive, since CCA itself is a two-view learning method. Accordingly, a novel semi-paired variant of CCA named Neighborhood Correlation Analysis (NeCA), which uses unpaired samples in a two-view way, is developed by incorporating between-view neighborhood relationships into CCA. These relationships are obtained by leveraging the within-view neighborhood relationships of all samples in each view (both paired and unpaired) together with the between-view pairing information. NeCA can thus take fuller advantage of the unpaired samples and effectively mitigate the overfitting caused by the limited paired data. Promising experimental results on several popular multi-view datasets demonstrate its feasibility and effectiveness.
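For context, the classical CCA step that NeCA builds on can be sketched as follows: on paired samples, CCA whitens each view's covariance and takes an SVD of the whitened cross-covariance. This is a minimal NumPy sketch of standard CCA only; NeCA's neighborhood-based use of unpaired samples is not reproduced here, and the small ridge term `reg` is an added numerical safeguard, not part of the original formulation.

```python
import numpy as np

def cca(X, Y, n_components=2, reg=1e-6):
    """Classical CCA on fully paired views X (n, dx) and Y (n, dy).

    Returns projections Wx, Wy maximizing the correlation between
    X @ Wx and Y @ Wy, plus the canonical correlations themselves.
    """
    # Center each view.
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    n = X.shape[0]

    # Within- and between-view covariances (ridge keeps inverses stable).
    Cxx = X.T @ X / n + reg * np.eye(X.shape[1])
    Cyy = Y.T @ Y / n + reg * np.eye(Y.shape[1])
    Cxy = X.T @ Y / n

    # Whitening transforms: Kx @ Cxx @ Kx.T = I (via Cholesky factors).
    Kx = np.linalg.inv(np.linalg.cholesky(Cxx))
    Ky = np.linalg.inv(np.linalg.cholesky(Cyy))

    # SVD of the whitened cross-covariance gives the canonical directions.
    U, s, Vt = np.linalg.svd(Kx @ Cxy @ Ky.T)
    Wx = Kx.T @ U[:, :n_components]
    Wy = Ky.T @ Vt.T[:, :n_components]
    return Wx, Wy, s[:n_components]
```

Because this objective involves only the paired cross-covariance `Cxy`, unpaired samples contribute nothing here, which is exactly the limitation the semi-paired variants discussed above address.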