In this short note, we demonstrate the use of principal component analysis (PCA) as a dimension reduction tool for the one-class support vector machine (one-class SVM). Unlike almost all other uses of PCA, which extract the eigenvectors associated with the largest eigenvalues as the projection directions, here it is the eigenvectors associated with small eigenvalues that are of interest, and in particular the null space, since the null space characterizes the features common to the training samples. Image retrieval examples illustrate the effectiveness of this dimension reduction.
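The idea can be sketched as follows (an illustrative outline, not the authors' code): compute a full PCA of the training features, keep the eigenvectors with the smallest eigenvalues as an approximation of the null space, project training and query samples onto that subspace, and fit a one-class SVM on the projections. The use of scikit-learn, the synthetic features, and the choice k = 10 are all assumptions made for this example.

# Sketch: one-class SVM on the small-eigenvalue (near-null) PCA subspace.
# Data, k, and SVM parameters are illustrative, not from the paper.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 50))   # stand-in for training image features
X_test = rng.normal(size=(20, 50))     # stand-in for query image features

pca = PCA()                            # full decomposition, all eigenvectors
pca.fit(X_train)

k = 10                                 # number of small-eigenvalue directions (assumption)
W = pca.components_[-k:]               # eigenvectors with the smallest eigenvalues

# Center with the training mean, then project onto the near-null subspace.
Z_train = (X_train - pca.mean_) @ W.T
Z_test = (X_test - pca.mean_) @ W.T

ocsvm = OneClassSVM(kernel="rbf", nu=0.1).fit(Z_train)
scores = ocsvm.decision_function(Z_test)  # larger = more consistent with training set

Because the training samples share their common features along the null-space directions, their projections there cluster tightly, so the one-class SVM boundary in this low-dimensional subspace can separate in-class queries from novelties.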