A training algorithm for optimal margin classifiers
COLT '92 Proceedings of the fifth annual workshop on Computational learning theory
The nature of statistical learning theory
From data distributions to regularization in invariant learning
Neural Computation
Noise injection: theoretical prospects
Neural Computation
Incorporating Invariances in Support Vector Learning Machines
ICANN 96 Proceedings of the 1996 International Conference on Artificial Neural Networks
RCV1: A New Benchmark Collection for Text Categorization Research
The Journal of Machine Learning Research
Estimation of Dependences Based on Empirical Data
Springer Series in Statistics
View-Invariant Pose Recognition Using Multilinear Analysis and the Universum
ISVC '08 Proceedings of the 4th International Symposium on Advances in Visual Computing, Part II
Midpoint-Validation Method for Support Vector Machine Classification
IEICE - Transactions on Information and Systems
Recognizing body poses using multilinear analysis and semi-supervised learning
Pattern Recognition Letters
Selecting informative universum sample for semi-supervised learning
IJCAI'09 Proceedings of the 21st international joint conference on Artificial intelligence
Exponential family sparse coding with applications to self-taught learning
IJCAI'09 Proceedings of the 21st international joint conference on Artificial intelligence
Empirical Study of the Universum SVM Learning for High-Dimensional Data
ICANN '09 Proceedings of the 19th International Conference on Artificial Neural Networks: Part I
Maximum Relative Margin and Data-Dependent Regularization
The Journal of Machine Learning Research
Midpoint-validation algorithm for support vector machine classification
Artificial Life and Robotics
Exploiting separability in large-scale linear support vector machine training
Computational Optimization and Applications
Document clustering with universum
Proceedings of the 34th international ACM SIGIR conference on Research and development in Information Retrieval
Neural network-based symbol recognition using a few labeled samples
Computers and Graphics
Can irrelevant data help semi-supervised learning, why and how?
Proceedings of the 20th ACM international conference on Information and knowledge management
Rough margin based core vector machine
PAKDD'10 Proceedings of the 14th Pacific-Asia conference on Advances in Knowledge Discovery and Data Mining - Volume Part I
Twin support vector machine with Universum data
Neural Networks
Structural twin support vector machine for classification
Knowledge-Based Systems
A nonparallel support vector machine for a classification problem with universum learning
Journal of Computational and Applied Mathematics
In this paper we study a framework, introduced by Vapnik (1998) and Vapnik (2006), that offers an alternative capacity concept to the large-margin approach. In the binary classification setting, we are given a set of labeled examples together with a collection of "non-examples" that belong to neither class of interest. This collection, called the Universum, encodes prior knowledge by representing meaningful concepts from the same domain as the problem at hand. We describe an algorithm that leverages the Universum by maximizing the number of observed contradictions, and we show experimentally that this approach yields accuracy improvements over learning from the labeled data alone.
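The idea of maximizing contradictions can be sketched as a standard hinge loss on the labeled examples plus an eps-insensitive loss that pulls Universum points toward the decision boundary. The sketch below is an illustrative linear variant trained by subgradient descent, not the paper's exact algorithm; the function name, hyperparameters (C, Cu, eps, lr, epochs), and optimizer are all assumptions made for this example.

```python
import numpy as np

def usvm_subgradient(X, y, U, C=1.0, Cu=1.0, eps=0.1, lr=0.01, epochs=300):
    """Sketch of a linear Universum-SVM trained by subgradient descent.

    Illustrative objective:
        0.5*||w||^2
        + C  * sum_i max(0, 1 - y_i*(w.x_i + b))    # hinge on labeled data
        + Cu * sum_j max(0, |w.u_j + b| - eps)      # eps-insensitive on Universum
    The second penalty pushes Universum points close to the decision
    boundary, which corresponds to maximizing contradictions on them.
    """
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        gw, gb = w.copy(), 0.0                  # gradient of the regularizer
        m = y * (X @ w + b)                     # labeled margins
        act = m < 1                             # margin-violating points
        gw -= C * (y[act][:, None] * X[act]).sum(axis=0)
        gb -= C * y[act].sum()
        s = U @ w + b                           # Universum scores
        out = np.abs(s) > eps                   # points outside the eps-tube
        sgn = np.sign(s[out])
        gw += Cu * (sgn[:, None] * U[out]).sum(axis=0)
        gb += Cu * sgn.sum()
        w -= lr * gw / n
        b -= lr * gb / n
    return w, b
```

On a toy problem with two labeled clusters and Universum points placed between them, the learned hyperplane separates the clusters while keeping the Universum scores small in magnitude.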