Kernel methods are a class of well-established and successful algorithms for pattern analysis, thanks to their mathematical elegance and good performance. Numerous nonlinear extensions of pattern recognition techniques have been proposed based on the so-called kernel trick. The objective of this paper is twofold. First, we derive a kernel tool that is still missing, namely the kernel quadratic discriminant (KQD). We discuss different formulations of KQD based on the regularized kernel Mahalanobis distance in both complete and class-related subspaces. Second, we propose suitable extensions of kernel linear and quadratic discriminants to indefinite kernels, yielding classifiers that are applicable to kernels defined by any symmetric similarity measure. This is important in practice, because problem-suited proximity measures often violate the requirement of positive definiteness. As in the traditional case, KQD can be advantageous for data with unequal class spreads in the kernel-induced spaces, which cannot be well separated by a linear discriminant. We illustrate this on artificial and real data for both positive definite and indefinite kernels.
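To make the central quantity concrete, the following is a minimal sketch of a regularized kernel Mahalanobis distance in the complete kernel-induced space, not the authors' exact formulation. It regularizes the feature-space covariance with a ridge term `reg` (playing the role of sigma^2) and applies the Woodbury identity so that only kernel evaluations are needed; the RBF kernel, the `gamma` and `reg` values, and the omission of the log-determinant term of a full quadratic discriminant are illustrative assumptions, and the indefinite-kernel extension discussed in the paper is not covered here (the sketch assumes a positive definite kernel).

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    """Gaussian RBF kernel k(a, b) = exp(-gamma * ||a - b||^2)."""
    d2 = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * d2)

def kernel_mahalanobis(K, Kx, kxx, reg=0.1):
    """Regularized kernel Mahalanobis distance of test points to the mean
    of one training class, computed entirely from kernel values.

    With centred feature map Phi_c and covariance C = Phi_c Phi_c^T / n,
    the Woodbury identity gives, for the ridge-regularized inverse,
      d^2(x) = ( k~(x,x) - k_c(x)^T (n*reg*I + K_c)^{-1} k_c(x) ) / reg.

    K   : (n, n) kernel matrix of the class's training points
    Kx  : (n, m) kernel values between training and test points
    kxx : (m,)  self-similarities k(x, x) of the test points
    reg : ridge term (sigma^2) added to the feature-space covariance
    """
    n = K.shape[0]
    # centred training kernel matrix K_c = H K H, H = I - (1/n) 1 1^T
    row = K.mean(axis=0, keepdims=True)
    Kc = K - row - row.T + K.mean()
    # test columns centred against the class mean in feature space
    kc = Kx - Kx.mean(axis=0, keepdims=True) - K.mean(axis=1, keepdims=True) + K.mean()
    # centred self-similarities ||phi(x) - mu||^2
    kxx_c = kxx - 2.0 * Kx.mean(axis=0) + K.mean()
    # quadratic form via a linear solve instead of an explicit inverse
    M = np.linalg.solve(n * reg * np.eye(n) + Kc, kc)
    quad = np.einsum('ij,ij->j', kc, M)
    return (kxx_c - quad) / reg

def kqd_predict(classes, T, gamma=0.5, reg=0.1):
    """Toy KQD-style rule: assign each test point in T to the class with
    the smallest regularized kernel Mahalanobis distance."""
    kxx = np.ones(T.shape[0])  # k(x, x) = 1 for the RBF kernel
    dists = [kernel_mahalanobis(rbf_kernel(X, X, gamma),
                                rbf_kernel(X, T, gamma), kxx, reg)
             for X in classes]
    return np.argmin(np.stack(dists), axis=0)
```

Because each class gets its own covariance model, two classes with very different spreads around nearby means receive different distance scalings, which is exactly the situation the abstract describes as hard for a linear discriminant.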