Among support vector machines, the Least Squares Support Vector Machine (LSSVM) is computationally attractive because it reduces a set of inequality constraints to a system of linear equations. Several pruning algorithms have been developed to obtain a reduced set of support vectors and thereby improve the generalization performance of LSSVM. However, most of these algorithms select the support vectors iteratively, which incurs high computational complexity. In this paper, inspired by the recently developed compressive sampling theory, a one-step compressive pruning strategy is proposed to construct a sparse LSSVM without a noticeable loss of accuracy. It is a fast, universal, and information-preserving pruning approach that avoids the intensive computation of iterative retraining. Experiments on pattern recognition and function approximation compare the proposed method with existing pruning approaches; the results show the feasibility of the proposed method and its superiority over its counterparts.
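To make the two stages concrete, the following is a minimal sketch, not the authors' implementation: an LSSVM classifier trained by solving its linear system, followed by a one-step sparsification that greedily selects a small subset of kernel columns to reproduce the dense decision values (an orthogonal-matching-pursuit-style recovery, used here as a stand-in for the paper's compressive pruning step). All function names, kernel choices, and parameter values are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    # Gaussian (RBF) kernel matrix between two sample sets
    d = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d / (2.0 * sigma ** 2))

def train_lssvm(X, y, gamma=10.0, sigma=1.0):
    # LSSVM training reduces to one linear system:
    # [ 0   1^T          ] [b]      [0]
    # [ 1   K + I/gamma  ] [alpha] = [y]
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]          # bias b, dense coefficients alpha

def compressive_prune(K, target, m):
    # One-pass greedy recovery: pick at most m kernel columns whose
    # span best reproduces the dense decision values (no retraining loop).
    residual = target.copy()
    idx, coef = [], np.zeros(0)
    for _ in range(m):
        j = int(np.argmax(np.abs(K.T @ residual)))
        if j not in idx:
            idx.append(j)
        coef, *_ = np.linalg.lstsq(K[:, idx], target, rcond=None)
        residual = target - K[:, idx] @ coef
    return np.array(idx), coef

# Toy two-class problem with well-separated Gaussian clusters
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 0.3, (30, 2)), rng.normal(1, 0.3, (30, 2))])
y = np.concatenate([-np.ones(30), np.ones(30)])

b, alpha = train_lssvm(X, y)
K = rbf_kernel(X, X)
idx, coef = compressive_prune(K, K @ alpha, m=10)   # keep <= 10 support vectors
sparse_out = K[:, idx] @ coef + b
acc = np.mean(np.sign(sparse_out) == y)
```

The key point the sketch illustrates is that the sparse model is obtained in a single recovery pass over the already-trained dense solution, rather than by repeatedly dropping vectors and retraining.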