Rule Extraction from Support Vector Machines: A Sequential Covering Approach
IEEE Transactions on Knowledge and Data Engineering
Prior knowledge about a problem domain can be used to bias Support Vector Machines (SVMs) towards learning better hypothesis functions. To this end, a number of methods have been proposed that demonstrate improved generalization performance after domain knowledge is applied, especially when training data are scarce. In this paper, we propose an extension of the virtual support vectors (VSVs) technique in which only a subset of the support vectors (SVs) is used. Unlike previous methods, the purpose here is to compensate for noise and uncertainty in the training data. Furthermore, we investigate the effect of domain knowledge not only on the quality of the SVM model but also on the rules extracted from it, and hence on the pattern learned by the SVM. Results on five benchmark data sets and one real-life data set show that domain knowledge can significantly improve both the quality of the SVM and the rules extracted from it.
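The core of the VSV idea referred to in the abstract is to take the support vectors of a trained SVM, apply invariance transformations drawn from domain knowledge, and add the transformed copies as extra training examples before retraining. The following is a minimal numpy sketch of that generation step, not the paper's implementation: `make_virtual_svs`, the shift transforms, and the subset-selection mask are all hypothetical illustrations (the paper's actual rule for choosing the SV subset is not reproduced here).

```python
import numpy as np

def make_virtual_svs(support_vectors, labels, transforms, keep_mask=None):
    """Generate virtual support vectors (VSVs) by applying invariance
    transformations to a chosen subset of the SVs. Each virtual example
    keeps the label of the SV it was derived from. keep_mask selects
    which SVs to expand (a placeholder for a subset-selection rule)."""
    if keep_mask is None:
        keep_mask = np.ones(len(support_vectors), dtype=bool)
    vsv_x, vsv_y = [], []
    for x, y in zip(support_vectors[keep_mask], labels[keep_mask]):
        for t in transforms:
            vsv_x.append(t(x))
            vsv_y.append(y)
    return np.array(vsv_x), np.array(vsv_y)

# Toy example: 2-D support vectors; assumed invariance = small shifts
# along the first coordinate (purely illustrative domain knowledge).
svs = np.array([[0.0, 1.0], [1.0, 0.0], [2.0, 2.0]])
ys = np.array([1, -1, 1])
shifts = [lambda x, d=d: x + np.array([d, 0.0]) for d in (-0.1, 0.1)]
mask = np.array([True, True, False])  # expand only a subset of the SVs
vx, vy = make_virtual_svs(svs, ys, shifts, mask)
# 2 selected SVs x 2 transforms -> 4 virtual examples to add
# to the training set before retraining the SVM.
```

The retraining step itself (fitting an SVM on the original data plus `vx`, `vy`) is omitted; any SVM implementation exposing its support vectors can be plugged into this loop.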