We present a simple, general scheme for improving margins, inspired by well-known principles of margin theory. The scheme is based on a sample re-weighting strategy: the basic idea is to add to the training set new replicas of samples that are not classified with a sufficient margin. As a case study, we present a new algorithm, TVQ, an instance of the proposed scheme that uses a tangent-distance-based 1-NN classifier implementing a form of quantization of the tangent-distance prototypes. The tangent-distance models created in this way show a significant improvement in generalization power over standard tangent models. Moreover, the resulting models outperformed other state-of-the-art algorithms, such as SVM, on an OCR task.
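The core re-weighting idea can be sketched in a few lines: compute each training sample's leave-one-out 1-NN margin (distance to the nearest other-class sample minus distance to the nearest same-class sample) and duplicate the samples whose margin falls below a threshold. The sketch below is an illustration only, not the paper's TVQ algorithm: it uses plain Euclidean distance as a stand-in for tangent distance, and the threshold `theta` is a hypothetical parameter.

```python
import numpy as np

def margin_reweight(X, y, theta=0.5):
    """One round of margin-based sample re-weighting (a sketch):
    samples whose leave-one-out 1-NN margin is below `theta`
    are replicated in the training set.

    Euclidean distance stands in for the paper's tangent distance
    (an assumption made for illustration)."""
    n = len(X)
    # pairwise Euclidean distances, with the diagonal masked out
    # so each sample's own copy is ignored (leave-one-out)
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(D, np.inf)
    extra_X, extra_y = [], []
    for i in range(n):
        d_same = D[i][y == y[i]].min()    # nearest same-class sample
        d_other = D[i][y != y[i]].min()   # nearest other-class sample
        # positive margin = correctly classified by 1-NN;
        # small or negative margin = near or across the boundary
        if d_other - d_same < theta:
            extra_X.append(X[i])
            extra_y.append(y[i])
    if not extra_X:
        return X.copy(), y.copy()
    return np.vstack([X] + extra_X), np.concatenate([y, extra_y])
```

In a full instance of the scheme the classifier would be retrained on the augmented set and the step repeated until every sample reaches the desired margin; replicating a sample is equivalent to doubling its weight in the training objective.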