In dealing with large data sets, the reduced support vector machine (RSVM) was proposed to overcome computational difficulties and to reduce model complexity. In this paper, we study the RSVM from the viewpoints of sampling design, robustness, and the spectral analysis of the reduced kernel. We view the nonlinear separating surface as a mixture of kernels. Instead of a full model, the RSVM uses a reduced mixture with kernels sampled from a certain candidate set. Our main results center on two themes: the robustness of the random subset mixture model and the spectral analysis of the reduced kernel. Robustness is judged by three criteria: 1) a model variation measure; 2) the model bias (deviation) between the reduced model and the full model; and 3) the test power in distinguishing the reduced model from the full one. For the spectral analysis, we compare the eigenstructures of the full kernel matrix and the approximation kernel matrices generated by uniform random subsets. The small discrepancies between them indicate that the approximation kernels retain most of the information in the full kernel that is relevant to learning tasks. We focus on the statistical theory of the reduced set method mainly in the context of the RSVM, but the use of a uniform random subset is not limited to the RSVM: the approach can act as a supplemental algorithm on top of a basic optimization algorithm, wherein the actual optimization takes place on the subset-approximated data, and the statistical properties discussed in this paper remain valid.
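The spectral comparison described above can be sketched numerically. The snippet below (a minimal illustration, not the paper's implementation; the data, kernel parameter `gamma`, and subset size `m` are arbitrary choices for the sketch) builds a full Gaussian kernel matrix, draws a uniform random subset to form the reduced rectangular kernel that the RSVM trains on, reconstructs a Nyström-style approximation of the full kernel from it, and compares the leading eigenvalues of the two matrices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: n points in 2-D (sizes are illustrative only)
n, d = 300, 2
A = rng.standard_normal((n, d))

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian kernel matrix K[i, j] = exp(-gamma * ||x_i - y_j||^2)."""
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq)

# Full kernel matrix K(A, A): n x n
K_full = rbf_kernel(A, A)

# Uniform random subset A_bar of size m << n (the reduced set)
m = 30
idx = rng.choice(n, size=m, replace=False)
A_bar = A[idx]

# Reduced rectangular kernel K(A, A_bar): n x m -- the RSVM's reduced mixture
# uses only these m kernel columns instead of all n
K_red = rbf_kernel(A, A_bar)

# Nystrom-style approximation of the full kernel built from the subset
K_mm = rbf_kernel(A_bar, A_bar)
K_approx = K_red @ np.linalg.pinv(K_mm) @ K_red.T

# Compare the leading eigenvalues (eigenstructures) of the two matrices
ev_full = np.sort(np.linalg.eigvalsh(K_full))[::-1][:10]
ev_approx = np.sort(np.linalg.eigvalsh(K_approx))[::-1][:10]
print("top-10 relative eigenvalue gaps:", np.abs(ev_full - ev_approx) / ev_full)
```

Small gaps in the leading eigenvalues are what the paper's spectral analysis formalizes: the uniform random subset preserves the dominant eigenstructure, hence most of the information relevant to learning.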