In many learning problems, prior knowledge about pattern variations can be formalized and beneficially incorporated into the analysis system. The corresponding notion of invariance is commonly used in conceptually different ways. We propose a more distinguishing treatment, in particular in the active field of kernel methods for machine learning and pattern analysis. Additionally, we clarify the fundamental relation between invariant kernels and traditional invariant pattern analysis by means of invariant representations. After addressing these conceptual questions, we focus on practical aspects and present two generic approaches for constructing invariant kernels. The first approach is based on a technique called invariant integration. The second approach builds on invariant distances. In principle, our approaches support general transformations, in particular covering discrete, non-group, or even infinite sets of pattern transformations. Additionally, both enable a smooth interpolation between invariant and non-invariant pattern analysis, i.e., they constitute a covering general framework. The wide applicability and various possible benefits of invariant kernels are demonstrated in different kernel methods.
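The two constructions can be sketched in a few lines. The following is a minimal illustration, not the paper's implementation: it assumes a small discrete transformation set (cyclic shifts of a vector) and an RBF base kernel with an arbitrary bandwidth. Invariant integration averages the base kernel over all pairs of transformed patterns; the invariant-distance approach minimizes a distance over transformed pairs and substitutes it into an RBF form.

```python
import numpy as np

def base_kernel(x, y, gamma=1.0):
    """Gaussian RBF base kernel (illustrative choice)."""
    return np.exp(-gamma * np.sum((x - y) ** 2))

def haar_integration_kernel(x, y, transforms, gamma=1.0):
    """Invariant integration: average the base kernel over all
    pairs of transformed patterns (here, a finite sum)."""
    vals = [base_kernel(g(x), h(y), gamma)
            for g in transforms for h in transforms]
    return float(np.mean(vals))

def invariant_distance(x, y, transforms):
    """Two-sided invariant distance: minimal Euclidean distance
    over all pairs of transformed patterns."""
    return min(np.linalg.norm(g(x) - h(y))
               for g in transforms for h in transforms)

def distance_substitution_kernel(x, y, transforms, gamma=1.0):
    """Substitute the invariant distance into an RBF form."""
    return float(np.exp(-gamma * invariant_distance(x, y, transforms) ** 2))

# Hypothetical discrete transformation set: all cyclic shifts of a length-4 vector.
shifts = [lambda v, s=s: np.roll(v, s) for s in range(4)]

x = np.array([1.0, 0.0, 0.0, 0.0])
y = np.roll(x, 2)  # a transformed copy of x

# Invariance: both kernels treat x and its shifted copy y identically.
# haar_integration_kernel(x, x, shifts) == haar_integration_kernel(x, y, shifts)
# invariant_distance(x, y, shifts) == 0
```

Note that a kernel obtained by distance substitution is not guaranteed to be positive definite in general, so kernel methods that tolerate indefinite kernels may be required; the integration-based construction inherits positive definiteness from the base kernel when averaging over a finite transformation set.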