This paper considers the classification of high-dimensional data with kernel methods. Exploiting the emptiness property of high-dimensional spaces, a kernel based on the Mahalanobis distance is proposed. Computing the Mahalanobis distance requires inverting a covariance matrix, but in high-dimensional spaces the estimated covariance matrix is ill-conditioned and its inversion is unstable or impossible. Using a parsimonious statistical model, namely the High Dimensional Discriminant Analysis model, the specific signal and noise subspaces are estimated for each considered class, making the inverse of the class-specific covariance matrix explicit and stable and leading to the definition of a parsimonious Mahalanobis kernel. An SVM-based framework is used to select the hyperparameters of the parsimonious Mahalanobis kernel by optimizing the so-called radius-margin bound. Experimental results on three high-dimensional data sets show that the proposed kernel is suitable for classifying high-dimensional data, providing better classification accuracies than the conventional Gaussian kernel.
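The core idea above can be sketched in a few lines of NumPy. This is a hypothetical illustration, not the authors' implementation: the covariance is eigendecomposed, the top-d eigenvalues are kept as the signal subspace, the remaining ones are replaced by their mean (a single noise level, in the spirit of the parsimonious model), and the resulting stable inverse is plugged into a Gaussian kernel on the Mahalanobis distance. The function name, the choice of `d`, and the `gamma` bandwidth are assumptions for the sketch.

```python
import numpy as np

def parsimonious_mahalanobis_kernel(X, Y, cov, d, gamma=1.0):
    """Gaussian kernel on a Mahalanobis distance with a parsimonious
    (signal + isotropic noise) inverse covariance. Hypothetical sketch."""
    # Eigendecomposition of the (symmetric) covariance, sorted descending.
    w, V = np.linalg.eigh(cov)
    w, V = w[::-1], V[:, ::-1]
    # Keep the top-d "signal" eigenvalues; model the rest with one noise level.
    noise = w[d:].mean()
    w_par = np.concatenate([w[:d], np.full(len(w) - d, noise)])
    # Explicit, well-conditioned inverse of the parsimonious covariance.
    inv = V @ np.diag(1.0 / w_par) @ V.T
    # Squared Mahalanobis distances between all pairs, then Gaussian kernel.
    D = np.array([[(x - y) @ inv @ (x - y) for y in Y] for x in X])
    return np.exp(-gamma * D)
```

In a multiclass SVM setting, one such kernel would be built per class from that class's estimated covariance, with `d` and `gamma` tuned, e.g., against the radius-margin bound as the paper describes.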