In our previous research we observed that the nonparallel plane proximal classifier (NPPC), obtained by minimizing two related regularized quadratic optimization problems, performs comparably to other support vector machine classifiers at a much lower computational cost. NPPC classifies a binary pattern according to its proximity to one of two nonparallel hyperplanes. To compute the distance of a pattern from a hyperplane, the value of the hyperplane function must be divided by the Euclidean norm of the hyperplane's normal vector; equivalently, that norm should be constrained to unity. In the original NPPC formulation these equality constraints were not imposed, so the minimizers of the objective functions are not guaranteed to satisfy them. In this work we reformulate NPPC with these equality constraints included and solve it by Newton's method, where each Newton update is obtained by solving a system of linear equations with the conjugate gradient method. The performance of the reformulated NPPC is verified experimentally on several benchmark and synthetic data sets for both linear and nonlinear classifiers. Apart from the technical improvement of incorporating these constraints into the NPPC formulation, the results indicate enhanced computational efficiency of the nonlinear NPPC on large data sets within the proposed framework.
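The two ingredients the abstract relies on (pattern-to-hyperplane distances normalized by the norm of the normal vector, and Newton updates computed by solving a linear system with the conjugate gradient method) can be illustrated with a short sketch. The Python snippet below is only a generic illustration of these ideas on a toy convex quadratic, not the authors' actual NPPC formulation; all function names and the test problem are assumptions introduced for the example.

    import numpy as np

    def signed_distance(x, w, b):
        # Distance of pattern x from the hyperplane w'x + b = 0.  The division
        # by ||w|| is exactly what a unit-norm constraint ||w|| = 1 would make
        # unnecessary.
        return (w @ x + b) / np.linalg.norm(w)

    def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
        # Solve A x = b for a symmetric positive definite A by conjugate gradients.
        n = b.shape[0]
        max_iter = max_iter or n
        x = np.zeros(n)
        r = b - A @ x
        p = r.copy()
        rs_old = r @ r
        for _ in range(max_iter):
            Ap = A @ p
            alpha = rs_old / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            rs_new = r @ r
            if np.sqrt(rs_new) < tol:
                break
            p = r + (rs_new / rs_old) * p
            rs_old = rs_new
        return x

    def newton_cg(grad, hess, w0, tol=1e-8, max_iter=50):
        # Newton's method in which each update direction is obtained by solving
        # the Newton system H d = -g with conjugate gradients.
        w = w0.astype(float).copy()
        for _ in range(max_iter):
            g = grad(w)
            if np.linalg.norm(g) < tol:
                break
            d = conjugate_gradient(hess(w), -g)
            w += d
        return w

    # Distance normalization: w'x + b alone gives -5 here, but the true distance is 1.
    w, b, x = np.array([3.0, 4.0]), -5.0, np.array([0.0, 0.0])
    print(signed_distance(x, w, b))      # -1.0

    # Toy usage: minimize a strictly convex quadratic 0.5 w'Qw - c'w, standing in
    # for one of the two regularized NPPC subproblems.
    rng = np.random.default_rng(0)
    M = rng.standard_normal((5, 5))
    Q = M.T @ M + np.eye(5)              # symmetric positive definite
    c = rng.standard_normal(5)
    w_star = newton_cg(lambda w: Q @ w - c, lambda w: Q, np.zeros(5))
    print(np.allclose(Q @ w_star, c))    # True

For a strictly convex quadratic a single Newton step, solved to tolerance by conjugate gradients, already reaches the minimizer, which is why the final check prints True.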