The traditional vectorized classifier incorporates the class structural information but ignores the individual structure of each single pattern. In contrast, the matrixized classifier considers both the class and the individual structures, and thus achieves superior performance to the vectorized classifier. In this paper, we explore a middle granularity, the cluster, lying between the class and the individual, and introduce the cluster structure, i.e., the structure within each class, into matrixized classifier design. Doing so simultaneously exploits the class, the cluster, and the individual structures, proceeding from the global level down to the single pattern. The proposed classifier design therefore owns three-fold structural information, which drives an improvement in classification performance. In practice, we adopt the Modification of Ho-Kashyap algorithm with Squared approximation of the misclassification errors (MHKS) as the learning paradigm and develop a Three-fold Structured MHKS, named TSMHKS. The advantage of the three-fold structural learning framework is that it accounts for the different degrees of closeness between samples, thereby improving performance. The experimental results demonstrate the feasibility and effectiveness of TSMHKS. Furthermore, we discuss the theoretical and experimental generalization bounds of the proposed algorithm.
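To make the underlying learning paradigm concrete, the following is a minimal sketch of the classical regularized Ho-Kashyap iteration on which MHKS is based: alternately grow the positive margin vector b and re-solve a regularized least-squares problem for the weight vector. This is only the base two-class vectorized variant under assumed default hyperparameters (c, rho, n_iter); the three-fold (class/cluster/individual) structural regularization of TSMHKS and the matrix-pattern form are not reproduced here.

```python
import numpy as np

def mhks(X, y, c=0.1, rho=0.99, n_iter=100):
    """Regularized Ho-Kashyap sketch: find w with Y @ w >= b > 0.

    X : (n, d) samples, y : (n,) labels in {-1, +1}.
    c is a ridge-style regularization constant, rho the margin step size.
    """
    # Augment with a bias column and fold in the label signs, so a
    # correct classification corresponds to a positive margin Y @ w.
    Y = np.hstack([X, np.ones((X.shape[0], 1))]) * y[:, None]
    n, d = Y.shape
    b = np.ones(n)                        # target margins, kept positive
    reg = c * np.eye(d)
    w = np.linalg.solve(Y.T @ Y + reg, Y.T @ b)
    for _ in range(n_iter):
        e = Y @ w - b                     # signed margin errors
        b = b + rho * (e + np.abs(e))     # only increase margins, so b > 0
        w = np.linalg.solve(Y.T @ Y + reg, Y.T @ b)
    return w

# Toy usage on two linearly separable blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2.0, 0.5, (20, 2)), rng.normal(2.0, 0.5, (20, 2))])
y = np.array([-1] * 20 + [1] * 20)
w = mhks(X, y)
pred = np.sign(np.hstack([X, np.ones((40, 1))]) @ w)
```

The cluster-level extension in TSMHKS would add further structure-dependent terms to the regularized objective; the sketch above shows only the shared alternating update.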