Matrices, or more generally multi-way arrays (tensors), are a common form of data encountered in a wide range of real applications. How to classify such data is an important research topic in both pattern recognition and machine learning. In this paper, by analyzing the relationship between two well-known classification approaches, the support vector machine (SVM) and the support tensor machine (STM), we propose a novel tensor-based method, the multiple-rank multi-linear SVM (MRMLSVM). Unlike traditional vector-based and tensor-based methods, MRMLSVM employs multiple-rank left and right projecting vectors to construct the decision boundary and establish the margin function. We show that the rank of the transformation can be regarded as a trade-off parameter that, in essence, balances learning capacity against generalization. We also propose an effective approach to solve the resulting non-convex optimization problem, and we analyze its convergence behavior, initialization, computational complexity, and parameter determination. Compared with vector-based classification methods, MRMLSVM achieves higher accuracy with lower computational complexity; compared with traditional supervised tensor-based methods, it performs better on matrix data classification. Promising experimental results on various kinds of data sets demonstrate the effectiveness of our method.
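The abstract does not give the decision function explicitly, but the description of "multiple-rank left and right projecting vectors" suggests a bilinear score of the form f(X) = Σ_k u_kᵀ X v_k + b applied directly to a matrix sample X. The sketch below is an illustrative assumption based on that reading, not the paper's actual formulation; the names `mrml_decision`, `U`, `V`, and `b` are hypothetical.

```python
import numpy as np

def mrml_decision(X, U, V, b):
    """Hypothetical multiple-rank bilinear decision function.

    X : (m, n) matrix sample.
    U : (m, r) matrix whose columns are left projecting vectors.
    V : (n, r) matrix whose columns are right projecting vectors.
    b : scalar bias.

    Returns f(X) = sum_k  u_k^T X v_k + b; the sign of f(X) gives
    the predicted class. The number of columns r is the rank of the
    transformation, i.e., the learning/generalization trade-off
    parameter described in the abstract.
    """
    r = U.shape[1]
    return sum(U[:, k] @ X @ V[:, k] for k in range(r)) + b
```

Note that with r = 1 this reduces to a rank-one STM-style classifier, while letting r grow toward min(m, n) recovers the flexibility of a full vector-based SVM on the flattened matrix, which is one way to read the claimed trade-off.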