In this paper, we present a simple and direct primal-dual support vector machine formulation of principal component analysis (PCA) in dual variables. By considering a mapping to a high-dimensional feature space and applying the kernel trick (Mercer's theorem), kernel PCA is obtained as introduced by Schölkopf et al. (2002). While least squares support vector machine (LS-SVM) classifiers have a natural link with kernel Fisher discriminant analysis (minimizing the within-class scatter around targets +1 and -1), PCA can be interpreted as a one-class modeling problem with a zero target value around which the variance is maximized. The score variables are interpreted as error variables within the problem formulation. In this way, primal-dual constrained optimization interpretations of linear and kernel PCA are obtained in the same style as for LS-SVM classifiers.
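As a minimal sketch of the dual view described above (not the authors' exact primal-dual code), the following computes kernel PCA by eigendecomposition of a centered kernel matrix; the eigenvectors play the role of the dual variables and the projections are the score variables. The RBF kernel and the parameter `sigma` are illustrative assumptions, not prescribed by the paper.

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    # Gaussian (RBF) kernel from pairwise squared Euclidean distances.
    # Assumed kernel choice; any Mercer kernel would do.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma**2))

def kernel_pca(X, n_components=2, sigma=1.0):
    n = X.shape[0]
    K = rbf_kernel(X, sigma)
    # Center the kernel matrix, i.e. center the data in feature space.
    ones = np.ones((n, n)) / n
    Kc = K - ones @ K - K @ ones + ones @ K @ ones
    # Eigendecomposition of the centered kernel matrix; the leading
    # eigenvectors are the dual variables of the components.
    eigvals, eigvecs = np.linalg.eigh(Kc)
    idx = np.argsort(eigvals)[::-1][:n_components]
    # Normalize so each feature-space component has unit norm.
    alphas = eigvecs[:, idx] / np.sqrt(np.maximum(eigvals[idx], 1e-12))
    # Score variables: projections of the training points.
    scores = Kc @ alphas
    return scores, alphas

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
scores, _ = kernel_pca(X, n_components=2, sigma=2.0)
print(scores.shape)  # (50, 2)
```

The first component maximizes the variance of the scores, the second maximizes it subject to orthogonality, mirroring the one-class interpretation with zero target value.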