In this paper, we study semi-supervised linear dimensionality reduction. Unlike conventional supervised methods, which consider only labeled instances, the semi-supervised scheme leverages abundant unlabeled instances during learning to achieve better generalization. Under the semi-supervised setting, our objective is to learn a subspace that is both smooth and discriminative; linear dimensionality reduction is then achieved by mapping all samples into this subspace. Specifically, we present the Transductive Component Analysis (TCA) algorithm, which generates such a subspace within a graph-theoretic framework. Because the basis produced by TCA is non-orthogonal, we further present the Orthogonal Transductive Component Analysis (OTCA) algorithm, which iteratively produces a series of orthogonal basis vectors and has better discriminating power than TCA. Experiments on synthetic and real-world datasets show that OTCA clearly improves over representative dimensionality reduction algorithms.
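The abstract does not spell out the TCA/OTCA objectives, but the general recipe it describes — a graph-based criterion over all samples, a linear projection obtained from a generalized eigenproblem, and an orthogonal variant that extracts basis vectors one at a time — can be sketched in a minimal form. The following Python sketch is illustrative only: the function names (`knn_laplacian`, `graph_embedding`, `orthogonal_embedding`), the unnormalized k-NN Laplacian, and the specific eigenproblem are assumptions, not the authors' formulation, and label information is omitted for brevity.

```python
import numpy as np
from scipy.linalg import eigh, null_space


def knn_laplacian(X, k=5):
    """Unnormalized graph Laplacian of a symmetrized k-NN graph (illustrative choice)."""
    n = X.shape[0]
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=2)
    A = np.zeros((n, n))
    for i in range(n):
        A[i, np.argsort(d2[i])[1:k + 1]] = 1.0  # skip self at index 0
    A = np.maximum(A, A.T)                      # symmetrize
    return np.diag(A.sum(axis=1)) - A


def graph_embedding(X, k=5, dim=2):
    """Non-orthogonal projection: smallest generalized eigenvectors of
    (X^T L X) w = lam (X^T X + eps I) w, a common graph-based criterion."""
    L = knn_laplacian(X, k)
    A = X.T @ L @ X
    B = X.T @ X + 1e-6 * np.eye(X.shape[1])     # ridge term keeps B positive definite
    _, vecs = eigh(A, B)                        # ascending eigenvalues
    return vecs[:, :dim]


def orthogonal_embedding(X, k=5, dim=2):
    """Orthogonal variant: extract basis vectors one at a time, each solved
    in the orthogonal complement of the previously found vectors."""
    L = knn_laplacian(X, k)
    A = X.T @ L @ X
    d = X.shape[1]
    B = X.T @ X + 1e-6 * np.eye(d)
    W = np.zeros((d, 0))
    for _ in range(dim):
        # Orthonormal basis Q of the complement of span(W); first pass uses all of R^d.
        Q = np.eye(d) if W.shape[1] == 0 else null_space(W.T)
        _, vecs = eigh(Q.T @ A @ Q, Q.T @ B @ Q)
        w = Q @ vecs[:, 0]                      # lift back to the original space
        W = np.hstack([W, (w / np.linalg.norm(w))[:, None]])
    return W
```

Because each new direction is drawn from the orthogonal complement of the previous ones, the returned basis satisfies `W.T @ W = I` up to numerical precision, whereas the eigenvectors of the plain generalized eigenproblem are only B-orthogonal.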