Learning data representations is a fundamental challenge in modeling neural processes and plays an important role in applications such as object recognition. Optimal component analysis (OCA) formulates the problem as optimization on a Grassmann manifold and uses a stochastic gradient method to estimate the optimal basis. OCA has been successfully applied to image classification problems arising in a variety of contexts; however, because the search space is typically very high-dimensional, OCA optimization is often computationally expensive. In multi-stage OCA, we first hierarchically project the data onto several low-dimensional subspaces using standard techniques, and then perform OCA learning hierarchically from the lowest to the highest level to learn a subspace that is optimal for data discrimination under the K-nearest-neighbor classifier. A main advantage of multi-stage OCA is that it greatly improves the computational efficiency of the OCA learning algorithm without sacrificing recognition performance, thus enhancing its applicability to practical problems. In addition to the nearest-neighbor classifier, we illustrate the effectiveness of the learned representations for object classification in conjunction with classifiers such as neural networks and support vector machines.
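The two-stage idea above can be sketched in code. The following is a minimal, hypothetical illustration, not the authors' algorithm: stage 1 reduces the data with a standard technique (PCA), and stage 2 stands in for OCA's stochastic gradient search on the Grassmann manifold with a simple random search over orthonormal bases, scored by leave-one-out 1-NN accuracy. All function names and parameters here are assumptions made for the sketch.

```python
# Hypothetical sketch of a multi-stage OCA-style pipeline.
# Stage 1: cheap projection with PCA; stage 2: search for a
# discriminative subspace scored by K-nearest-neighbor accuracy.
# (Random search is a toy stand-in for OCA's stochastic gradient
# optimization on the Grassmann manifold.)
import numpy as np

def pca_project(X, d):
    """Stage 1: project rows of X onto the top-d principal components."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:d].T, Vt[:d].T

def knn_accuracy(Z, y, k=1):
    """Leave-one-out K-NN classification accuracy in the projected space Z."""
    D = np.linalg.norm(Z[:, None, :] - Z[None, :, :], axis=-1)
    np.fill_diagonal(D, np.inf)            # exclude each point from its own neighbors
    nn = np.argsort(D, axis=1)[:, :k]
    pred = np.array([np.bincount(y[idx]).argmax() for idx in nn])
    return float((pred == y).mean())

def random_basis_search(X, y, d, iters=50, seed=0):
    """Stage 2 (toy): sample orthonormal bases (points on the Grassmannian
    via QR) and keep the one with the best K-NN score."""
    rng = np.random.default_rng(seed)
    best_U, best_acc = None, -1.0
    for _ in range(iters):
        A = rng.standard_normal((X.shape[1], d))
        U, _ = np.linalg.qr(A)             # orthonormal d-dimensional basis
        acc = knn_accuracy(X @ U, y)
        if acc > best_acc:
            best_U, best_acc = U, acc
    return best_U, best_acc

# Toy data: two Gaussian classes in 20 dimensions.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 1.0, (30, 20)),
               rng.normal(2.0, 1.0, (30, 20))])
y = np.repeat([0, 1], 30)

Z, _ = pca_project(X, 8)                   # stage 1: reduce to 8 dimensions
U, acc = random_basis_search(Z, y, 3)      # stage 2: search a 3-dim subspace
print(f"best leave-one-out 1-NN accuracy: {acc:.2f}")
```

The computational point of the abstract shows up here: the expensive search in stage 2 runs in the 8-dimensional PCA space rather than the original 20 (or, for images, thousands of) dimensions.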