In this article, we address the problem of compressed classification learning. We establish a generalization bound for the support vector machine (SVM) compressed classification algorithm with uniformly ergodic Markov chain samples. This bound shows that the accuracy of the SVM classifier trained in the compressed domain is close to that of the best classifier in the original data domain, and hence that compressed learning can avoid the curse of dimensionality in the learning process. In addition, we show that compressed classification learning reduces the learning time at the price of some loss in classification accuracy, and that this loss can be controlled. Numerical experiments further verify the results claimed in this article.
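To make the idea concrete, the following is a minimal sketch of compressed classification learning, assuming scikit-learn is available: the data are mapped to a lower-dimensional compressed domain by a random Gaussian projection (a stand-in for a generic compressed-sensing measurement matrix) and a linear SVM is trained there, then compared with an SVM trained in the original data domain. The synthetic data, the choice of `GaussianRandomProjection` and `LinearSVC`, and the i.i.d. splitting are illustrative assumptions; the paper's Markov chain sampling scheme and its specific algorithm are not reproduced here.

```python
# A minimal sketch of compressed classification learning (assumes scikit-learn).
# The Gaussian random projection plays the role of the compression matrix;
# this is an illustration, not the paper's exact algorithm.
import time

from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.random_projection import GaussianRandomProjection
from sklearn.svm import LinearSVC

# High-dimensional synthetic data in the original ("data") domain.
X, y = make_classification(n_samples=2000, n_features=500,
                           n_informative=50, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Baseline: SVM trained in the data domain.
t0 = time.time()
clf_full = LinearSVC(dual=False).fit(X_train, y_train)
full_time = time.time() - t0
full_acc = clf_full.score(X_test, y_test)

# Compress training and test data with one random projection,
# then train the SVM in the compressed domain.
proj = GaussianRandomProjection(n_components=50, random_state=0)
Z_train = proj.fit_transform(X_train)
Z_test = proj.transform(X_test)

t0 = time.time()
clf_comp = LinearSVC(dual=False).fit(Z_train, y_train)
comp_time = time.time() - t0
comp_acc = clf_comp.score(Z_test, y_test)

print(f"data domain:       accuracy={full_acc:.3f}, train time={full_time:.3f}s")
print(f"compressed domain: accuracy={comp_acc:.3f}, train time={comp_time:.3f}s")
```

On typical runs of a sketch like this, training in the 50-dimensional compressed domain is noticeably faster than in the 500-dimensional data domain, while the accuracy drop stays small, which mirrors the trade-off described in the abstract.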