The support vector machine (SVM) is an effective method for classification problems: it seeks the optimal hyperplane that maximizes the margin between two classes, obtained by solving a constrained optimization criterion via quadratic programming (QP). This QP step incurs a high computational cost. The least squares support vector machine (LS-SVM), a variant of SVM, avoids this shortcoming by obtaining an analytical solution directly from a set of linear equations instead of QP. Both SVM and LS-SVM operate on patterns represented as vectors, i.e., before applying SVM or LS-SVM, any non-vector pattern such as an image must first be vectorized by a technique such as concatenation. However, some implicit structural or local contextual information may be lost in this transformation. Moreover, since the dimension of the weight vector in SVM or LS-SVM with the linear kernel equals the dimension d1 × d2 of the original input pattern, the higher the dimension of a vector pattern, the more space is needed to store the weights. In this paper, inspired by feature extraction methods that operate directly on matrix patterns and by the advantages of LS-SVM, we propose a new matrix-pattern-based classifier design method, called MatLSSVM, which not only operates directly on the original matrix patterns but also efficiently reduces the memory required for the weights from d1 × d2 to d1 + d2. Like LS-SVM, however, MatLSSVM inherits the existence of unclassifiable regions when extended to multi-class problems. Thus, following the fuzzy version of LS-SVM, a corresponding fuzzy version of MatLSSVM (MatFLSSVM) is further proposed to remove unclassifiable regions effectively in multi-class problems.
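The memory saving described above follows from replacing the single weight vector of a vectorized linear classifier with a pair of left/right weight vectors applied to the matrix pattern. The sketch below (a hypothetical illustration, not the paper's training procedure; the dimensions, random weights, and function names are assumed for demonstration) shows the bilinear decision function f(A) = uᵀAv + b and its parameter count of d1 + d2, together with its equivalence to a vectorized linear classifier whose weight is the Kronecker product v ⊗ u:

```python
import numpy as np

# Hypothetical sketch of the matrix-pattern decision function: a d1 x d2
# pattern A is scored with a left weight vector u (length d1) and a right
# weight vector v (length d2), so only d1 + d2 weights are stored instead
# of the d1 * d2 weights a vectorized linear LS-SVM would need.

d1, d2 = 32, 24                      # example image dimensions (assumed)
rng = np.random.default_rng(0)
u = rng.standard_normal(d1)          # left weight vector
v = rng.standard_normal(d2)          # right weight vector
b = 0.5                              # bias term

def mat_decision(A, u, v, b):
    """Bilinear decision value f(A) = u^T A v + b for a matrix pattern A."""
    return float(u @ A @ v + b)

def predict(A, u, v, b):
    """Class label in {-1, +1} from the sign of the decision value."""
    return 1 if mat_decision(A, u, v, b) >= 0 else -1

A = rng.standard_normal((d1, d2))    # one matrix pattern

# Equivalent vectorized view: u^T A v = (v kron u)^T vec(A), where vec(A)
# stacks the columns of A.  The equivalent weight vector has d1 * d2 entries.
w_vec = np.kron(v, u)

print("matrix-form weights:", d1 + d2)       # 56
print("vectorized weights: ", d1 * d2)       # 768
```

Note that the equivalent vectorized weight w_vec is constrained to be a rank-one Kronecker product; this restriction is what buys the d1 + d2 storage, at the price of a less expressive weight than a free d1 × d2 vector.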
Experimental results on several benchmark datasets show that the proposed methods are competitive in classification performance with LS-SVM, fuzzy LS-SVM (FLS-SVM), and the more recent MatPCA and MatFLDA. More importantly, the idea used here may offer a novel way of constructing learning models.