We have developed an informative sample subspace (ISS) method for projecting high-dimensional data onto a low-dimensional subspace for classification. In this paper, we present an ISS algorithm that uses a maximal mutual information criterion to search a labelled training data set directly for the subspace's projection base vectors. We evaluate the ISS method on synthetic data as well as real-world problems. Experimental results demonstrate that the ISS algorithm is effective and can serve as a general method for representing high-dimensional data in a low-dimensional subspace for classification.
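The idea can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes a simple histogram estimator of mutual information and a greedy search that, following the abstract, draws candidate base vectors directly from the labelled training samples, keeping the candidate whose 1-D projection shares the most mutual information with the class labels and deflating the remaining candidates so later picks are orthogonal.

```python
import numpy as np

def mutual_information(proj, labels, bins=10):
    """Histogram estimate of MI between a 1-D projection and class labels."""
    edges = np.histogram_bin_edges(proj, bins=bins)
    z = np.clip(np.digitize(proj, edges[1:-1]), 0, bins - 1)
    mi = 0.0
    for b in range(bins):
        pz = np.mean(z == b)          # P(bin)
        if pz == 0:
            continue
        for c in np.unique(labels):
            pc = np.mean(labels == c)             # P(class)
            pzc = np.mean((z == b) & (labels == c))  # P(bin, class)
            if pzc > 0:
                mi += pzc * np.log(pzc / (pz * pc))
    return mi

def informative_sample_subspace(X, y, dim=2):
    """Greedy sketch of an ISS-style search: pick `dim` base vectors
    from the (normalized) training samples themselves, each maximizing
    MI between the projected data and the labels."""
    cands = X / np.linalg.norm(X, axis=1, keepdims=True)
    basis = []
    for _ in range(dim):
        scores = [mutual_information(X @ v, y) for v in cands]
        b = cands[int(np.argmax(scores))].copy()
        basis.append(b)
        # Deflate candidates so later picks are orthogonal to chosen ones
        cands = cands - np.outer(cands @ b, b)
        norms = np.linalg.norm(cands, axis=1, keepdims=True)
        cands = cands / np.where(norms > 1e-12, norms, 1.0)
    return np.stack(basis)  # (dim, d) projection matrix

# Toy two-class problem: the label depends on a single latent feature
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 20))
y = (X[:, 3] > 0).astype(int)
W = informative_sample_subspace(X, y, dim=2)
Z = X @ W.T  # low-dimensional representation, ready for any classifier
print(W.shape, Z.shape)
```

The candidate pool, estimator, and deflation step here are illustrative choices; the published algorithm's search strategy and MI estimator may differ, but the sketch shows the core loop: score each candidate direction by the mutual information its projection carries about the labels, then keep the best.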