In this paper, we propose a nonlinear feature extraction method that reduces the dimensionality of the input space for regression problems. Previously, LDAr, a regression version of linear discriminant analysis, was proposed as a feature extraction method. Here, LDAr is generalized to a nonlinear discriminant analysis by means of the so-called kernel trick: the input space is mapped into a high-dimensional feature space whose variables are nonlinear transformations of the input variables, and in that feature space we maximize the ratio of the distances between sample pairs with large differences in the target value to the distances between pairs with small differences in the target value. It is well known that the distribution of face images under perceivable variations in translation, rotation, and scaling is highly nonlinear, and that face alignment is a complex regression problem. We applied the proposed method to various regression problems, including face alignment, and achieved better performance than conventional linear feature extraction methods.
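The pairwise-distance criterion described above can be sketched as follows. This is a minimal illustrative implementation of the *linear* LDAr idea, not the authors' exact formulation: sample pairs are split by a threshold on the target-value difference (the median pairwise difference here is a hypothetical choice), scatter matrices are accumulated over the two sets of pairs, and the projection maximizing their ratio is obtained from a generalized eigenproblem. The kernel version of the paper would apply the same construction after a nonlinear mapping of the inputs.

```python
import numpy as np
from scipy.linalg import eigh

def ldar_sketch(X, y, n_components=2, threshold=None):
    """Illustrative linear LDAr sketch.

    Splits sample pairs by the difference in their target values and
    maximizes the ratio of the scatter of large-difference pairs to
    that of small-difference pairs.
    """
    n, d = X.shape
    if threshold is None:
        # Hypothetical choice: median absolute pairwise target difference.
        diffs = np.abs(y[:, None] - y[None, :])
        threshold = np.median(diffs[np.triu_indices(n, k=1)])
    S_large = np.zeros((d, d))  # scatter of pairs with large target difference
    S_small = np.zeros((d, d))  # scatter of pairs with small target difference
    for i in range(n):
        for j in range(i + 1, n):
            diff = X[i] - X[j]
            outer = np.outer(diff, diff)
            if abs(y[i] - y[j]) > threshold:
                S_large += outer
            else:
                S_small += outer
    # Maximize the ratio via the generalized eigenproblem
    # S_large w = lambda * S_small w; small ridge keeps S_small invertible.
    reg = 1e-8 * np.eye(d)
    vals, vecs = eigh(S_large, S_small + reg)
    order = np.argsort(vals)[::-1]  # largest ratio first
    return vecs[:, order[:n_components]]
```

Projecting the data onto the returned directions (`X @ W`) then yields features along which samples with very different target values are spread apart while samples with similar targets stay close.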