Recently, several manifold learning algorithms have been proposed, such as ISOMAP (Tenenbaum et al., 2000), Locally Linear Embedding (Roweis & Saul, 2000), Laplacian Eigenmap (Belkin & Niyogi, 2001), and Locality Preserving Projection (LPP) (He & Niyogi, 2003). All of them aim to discover the meaningful low-dimensional structure of the data space. In this paper, we present a statistical analysis of the LPP algorithm. Unlike Principal Component Analysis (PCA), which obtains a subspace spanned by the largest eigenvectors of the global covariance matrix, we show that LPP obtains a subspace spanned by the smallest eigenvectors of the local covariance matrix. We applied PCA and LPP to a real-world document clustering task. Experimental results show that clustering performance is significantly improved in the reduced subspace, and that LPP in particular works much better than PCA.
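The contrast between the two projections can be sketched in a few lines of NumPy/SciPy. This is a minimal illustration, not the paper's implementation: the k-NN graph size `k`, the heat-kernel width `t`, and the small regularization term added for numerical stability are all assumptions. PCA keeps the largest eigenvectors of the global covariance, while LPP solves a generalized eigenproblem built from a neighborhood graph and keeps the smallest.

```python
import numpy as np
from scipy.linalg import eigh

def pca(X, d):
    """Top-d eigenvectors of the global covariance matrix (largest eigenvalues)."""
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / len(X)
    vals, vecs = eigh(cov)           # eigenvalues in ascending order
    return vecs[:, -d:][:, ::-1]     # keep the d largest directions

def lpp(X, d, k=5, t=1.0):
    """Bottom-d generalized eigenvectors of X^T L X a = lam X^T D X a,
    where L = D - W is the Laplacian of a k-NN heat-kernel graph."""
    n = len(X)
    dist2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(dist2[i])[1:k + 1]      # k nearest neighbors (skip self)
        W[i, nbrs] = np.exp(-dist2[i, nbrs] / t)  # heat-kernel edge weights
    W = np.maximum(W, W.T)                        # symmetrize the graph
    D = np.diag(W.sum(axis=1))
    L = D - W
    A = X.T @ L @ X
    B = X.T @ D @ X + 1e-6 * np.eye(X.shape[1])   # regularize for stability (assumption)
    vals, vecs = eigh(A, B)                       # ascending generalized eigenvalues
    return vecs[:, :d]                            # keep the d smallest directions

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 4))
print(pca(X, 2).shape, lpp(X, 2).shape)  # -> (4, 2) (4, 2)
```

Both functions return a projection matrix of shape (n_features, d); the only structural difference is which end of the eigenvalue spectrum is kept and whether the covariance is global or graph-local.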