The success of bilinear subspace learning heavily depends on reducing correlations among features along the rows and columns of the data matrices. In this work, we study the problem of rearranging the elements within a matrix so as to maximize these correlations, allowing existing bilinear subspace learning algorithms to remove information redundancy in matrix data more extensively. We propose an efficient iterative algorithm to tackle this problem, which is essentially an integer program. In each step, the matrix structure is refined with a constrained Earth Mover's Distance procedure that incrementally rearranges each matrix to become more similar to its low-rank approximation, which has high correlation among features along rows and columns. In addition, we present two extensions of the algorithm for supervised bilinear subspace learning. Experiments in both unsupervised and supervised settings demonstrate the effectiveness of the proposed algorithms in improving data compression performance and classification accuracy.
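The core loop described above can be sketched as follows. This is a simplified illustration, not the paper's algorithm: in place of the constrained Earth Mover's Distance procedure it uses a plain sort-based assignment (the one-dimensional optimal-transport special case), which moves each entry of the matrix to the position whose low-rank target value has the same rank order. All function names and parameters here are illustrative assumptions.

```python
import numpy as np

def low_rank_approx(X, k):
    """Best rank-k approximation of X via truncated SVD."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :k] * s[:k]) @ Vt[:k, :]

def rearrange_toward_low_rank(X, k=2, n_iters=10):
    """Iteratively permute the entries of X so the rearranged matrix
    moves closer to its own rank-k approximation.  The transport step
    is a sort-based assignment standing in for the constrained EMD."""
    X = X.copy()
    shape = X.shape
    for _ in range(n_iters):
        A = low_rank_approx(X, k)
        # Assign the sorted values of X to positions sorted by the
        # target (low-rank) values: the entries never change, only
        # their positions do.
        order_target = np.argsort(A, axis=None)
        Y = np.empty(X.size)
        Y[order_target] = np.sort(X, axis=None)
        X = Y.reshape(shape)
    return X

rng = np.random.default_rng(0)
M = rng.standard_normal((8, 8))
M2 = rearrange_toward_low_rank(M, k=2)

# Rearrangement preserves the multiset of entries but leaves less
# residual energy outside the rank-2 subspace.
err_before = np.linalg.norm(M - low_rank_approx(M, 2))
err_after = np.linalg.norm(M2 - low_rank_approx(M2, 2))
print(err_before, err_after)
```

Note that each iteration cannot increase the low-rank residual: the sorted assignment minimizes the Frobenius distance to the current target among all permutations, and the new low-rank approximation can only fit the permuted matrix better. The full method replaces this unconstrained sort with an EMD step whose constraints limit how far entries may move per iteration.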