Nonlinear component analysis as a kernel eigenvalue problem. Neural Computation.
Duality and Geometry in SVM Classifiers. ICML '00: Proceedings of the Seventeenth International Conference on Machine Learning.
Support Vector Data Description. Machine Learning.
Kernel Methods for Pattern Analysis.
Distinctive Image Features from Scale-Invariant Keypoints. International Journal of Computer Vision.
Discriminative Common Vectors for Face Recognition. IEEE Transactions on Pattern Analysis and Machine Intelligence.
A Maximum Entropy Framework for Part-Based Texture and Object Recognition. ICCV '05: Proceedings of the Tenth IEEE International Conference on Computer Vision, Volume 1.
Learning Nonlinear Image Manifolds by Global Alignment of Local Linear Models. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Pattern classification via single spheres. DS '05: Proceedings of the 8th International Conference on Discovery Science.
Modeling the manifolds of images of handwritten digits. IEEE Transactions on Neural Networks.
Large margin classifiers based on affine hulls. Neurocomputing.
Hyperdisk based large margin classifier. Pattern Recognition.
In high-dimensional classification problems it is infeasible to include enough training samples to cover the class regions densely. Irregularities in the resulting sparse sample distributions cause local classifiers such as Nearest Neighbors (NN) and kernel methods to have irregular decision boundaries. One solution is to "fill in the holes" by building a convex model of the region spanned by each class's training samples and classifying test examples by their distances to these approximate models. Methods of this kind based on affine hulls, convex hulls, and bounding hyperspheres have already been studied. Here we propose a method based on the bounding hyperdisk of each class, the intersection of the affine hull and the smallest bounding hypersphere of its training samples. We argue that in many cases hyperdisks are preferable to affine hulls, convex hulls, and hyperspheres: they bound the classes more tightly than affine hulls or hyperspheres, while avoiding much of the sample overfitting and computational complexity inherent in high-dimensional convex hulls. We show that the hyperdisk method can be kernelized to provide nonlinear classifiers based on non-Euclidean distance metrics. Experiments on several classification problems show promising results.
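The distance-to-hyperdisk rule described in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: it approximates the exact minimum enclosing hypersphere by a ball centered at the class centroid with radius equal to the farthest sample, and the `classify` helper is a hypothetical wrapper for the nearest-hyperdisk decision.

```python
import numpy as np

def hyperdisk_distance(X, q, tol=1e-10):
    """Distance from query q to an approximate bounding hyperdisk of samples X.

    The hyperdisk is the intersection of the samples' affine hull with a
    bounding ball. Here the ball is centered at the centroid with radius
    equal to the farthest sample, a simple surrogate for the exact
    minimum enclosing hypersphere.
    """
    mu = X.mean(axis=0)
    # Orthonormal basis for the direction space of the affine hull (via SVD).
    _, s, Vt = np.linalg.svd(X - mu, full_matrices=False)
    basis = Vt[s > tol]
    # Project the query onto the affine hull.
    p = mu + basis.T @ (basis @ (q - mu))
    # Clamp the projection to the bounding ball; its center mu lies in the hull,
    # so the intersection is a disk of the same radius centered at mu.
    r = np.max(np.linalg.norm(X - mu, axis=1))
    d = p - mu
    nd = np.linalg.norm(d)
    if nd > r:
        p = mu + d * (r / nd)
    return np.linalg.norm(q - p)

def classify(class_samples, q):
    """Assign q to the class whose hyperdisk is nearest (hypothetical helper).

    class_samples: dict mapping class label -> (n_samples, n_features) array.
    """
    return min(class_samples, key=lambda c: hyperdisk_distance(class_samples[c], q))
```

The returned distance combines the orthogonal distance from the query to the affine hull with any excess, within the hull, beyond the ball radius; a query whose hull projection already falls inside the disk is scored purely by its distance to the hull.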