In this paper, we propose a novel method named Mixed Kernel CCA (MKCCA) to achieve an easy yet accurate implementation of dimensionality reduction. MKCCA consists of two major steps. First, the high-dimensional data are mapped into a reproducing kernel Hilbert space (RKHS), rather than an ordinary Hilbert space, using a mixture of kernels, i.e., a linear combination of a local kernel and a global kernel; a uniform design for experiments with mixtures is also introduced for model selection. Second, in the new RKHS, kernel CCA is further improved by performing principal component analysis (PCA) followed by CCA for effective dimensionality reduction. We prove that MKCCA can be decomposed into two separate components, PCA and CCA, which better removes noise and tackles the trivial-solution problem that arises in CCA and traditional kernel CCA. The reduced data produced by MKCCA can then be used in multiple learning paradigms, such as multi-view learning, supervised learning, semi-supervised learning, and transfer learning. Extensive experimental results demonstrate its superiority over existing methods across these learning paradigms.
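The two-step pipeline described above (a mixed-kernel mapping into an RKHS, KPCA for denoising, then CCA on the reduced representations) can be sketched in plain NumPy. This is a minimal illustration, not the paper's exact formulation: the choice of an RBF kernel as the local kernel and a polynomial kernel as the global kernel, the mixing weight `lam`, the regularization constant, and the synthetic two-view data are all assumptions made for the sketch.

```python
import numpy as np

def rbf_kernel(X, gamma=0.5):
    # Local kernel: Gaussian RBF (illustrative choice).
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def poly_kernel(X, degree=2, c=1.0):
    # Global kernel: inhomogeneous polynomial (illustrative choice).
    return (X @ X.T + c) ** degree

def mixed_kernel(X, lam=0.5):
    # Step 1: linear combination of a local and a global kernel.
    return lam * rbf_kernel(X) + (1.0 - lam) * poly_kernel(X)

def kpca_scores(K, n_components):
    # Center the kernel matrix and project onto the leading eigenvectors.
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    Kc = H @ K @ H
    w, V = np.linalg.eigh(Kc)
    idx = np.argsort(w)[::-1][:n_components]
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 1e-12))

def cca(Z1, Z2, n_components, reg=1e-6):
    # Classical CCA via whitening + SVD of the cross-covariance.
    Z1 = Z1 - Z1.mean(axis=0)
    Z2 = Z2 - Z2.mean(axis=0)
    n = len(Z1)
    C11 = Z1.T @ Z1 / n + reg * np.eye(Z1.shape[1])
    C22 = Z2.T @ Z2 / n + reg * np.eye(Z2.shape[1])
    C12 = Z1.T @ Z2 / n
    W1 = np.linalg.inv(np.linalg.cholesky(C11)).T  # whitener for view 1
    W2 = np.linalg.inv(np.linalg.cholesky(C22)).T  # whitener for view 2
    U, s, Vt = np.linalg.svd(W1.T @ C12 @ W2)
    A = W1 @ U[:, :n_components]
    B = W2 @ Vt.T[:, :n_components]
    return Z1 @ A, Z2 @ B, s[:n_components]

# Synthetic two-view data sharing a latent variable t (illustrative).
rng = np.random.default_rng(0)
t = rng.standard_normal(60)
X1 = np.c_[t, rng.standard_normal((60, 4))]
X2 = np.c_[np.sin(t), rng.standard_normal((60, 3))]

# Step 1 + denoising: mixed kernel, then KPCA per view.
Z1 = kpca_scores(mixed_kernel(X1), 5)
Z2 = kpca_scores(mixed_kernel(X2), 5)

# Step 2: CCA on the KPCA-reduced views.
U1, U2, corr = cca(Z1, Z2, 2)
print(U1.shape, U2.shape)  # (60, 2) (60, 2)
```

Running KPCA before CCA is what lets the regularized covariance matrices stay well-conditioned here: the leading kernel principal components discard low-variance directions that would otherwise make the whitening step unstable.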