The least squares support vector machine (LSSVM), which, like the standard support vector machine (SVM), is based on structural risk minimization, can be obtained by solving a simpler optimization problem than that of the SVM. However, LSSVM does not take full account of the local structure of the data samples, in particular their intrinsic manifold structure. To address this problem, and inspired by manifold learning techniques, we propose a novel iterative least squares classifier, coined the optimal locality preserving least squares support vector machine (OLP-LSSVM). The idea is to combine structural risk minimization and a locality preserving criterion in a unified framework, exploiting the manifold structure of the data samples to enhance LSSVM. Furthermore, inspired by recent developments in simultaneous optimization, the adjacency graph underlying the locality preserving criterion is optimized at the same time, giving rise to improved discriminative performance. The resulting model can be solved by an alternating optimization method. Experimental results on several publicly available benchmark data sets show the feasibility and effectiveness of the proposed method.
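To make the idea concrete, the following is a minimal sketch, not the paper's exact formulation: a linear least-squares classifier regularized by a graph-Laplacian locality preserving term, alternating between (a) solving for the weight vector given the current adjacency graph and (b) rebuilding the graph from the current projections. The heat-kernel kNN graph, the function names, and all hyperparameter values are illustrative assumptions.

```python
import numpy as np

def knn_adjacency(Z, k=3, sigma=1.0):
    """Symmetric kNN adjacency matrix with heat-kernel weights (illustrative)."""
    n = Z.shape[0]
    d2 = ((Z[:, None, :] - Z[None, :, :]) ** 2).sum(-1)  # squared distances
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(d2[i])[1:k + 1]  # k nearest neighbors, skipping self
        W[i, idx] = np.exp(-d2[i, idx] / (2.0 * sigma ** 2))
    return np.maximum(W, W.T)  # symmetrize

def fit_olp_lssvm_sketch(X, y, lam=1e-2, mu=1e-1, k=3, iters=3):
    """Alternating optimization: solve for w, then refresh the adjacency graph.

    Minimizes ||Xw - y||^2 + lam*||w||^2 + mu * w^T X^T L X w,
    a simplified linear stand-in for the locality preserving LSSVM objective.
    """
    n, d = X.shape
    w = np.zeros(d)
    for t in range(iters):
        # Graph step: initial graph from inputs, later graphs from projections.
        Z = X if t == 0 else (X @ w)[:, None]
        W = knn_adjacency(Z, k=k)
        L = np.diag(W.sum(axis=1)) - W  # unnormalized graph Laplacian
        # Classifier step: closed-form solve of the regularized least squares.
        A = X.T @ X + lam * np.eye(d) + mu * X.T @ L @ X
        w = np.linalg.solve(A, X.T @ y)
    return w
```

With `mu = 0` this reduces to plain ridge-regularized least squares; the Laplacian term additionally penalizes projections that separate neighboring samples, which is the locality preserving criterion in its simplest linear form.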