In this paper, the classification power of the eigenvalues of six graph-associated matrices is investigated and evaluated on a benchmark dataset for optical character recognition. The extracted eigenvalues are used as feature vectors for multi-class classification with support vector machines. Each graph-associated matrix captures a certain type of geometric/spatial information, which may be important for the classification process. Classification results are presented for all six feature types, as well as for classifier combinations at the decision level; for the decision-level combination, probabilistic-output support vector machines are applied. The eigenvalues of the weighted adjacency matrix provide the best classification rate of 89.9%. Almost half of the misclassified letters are confusion pairs, such as I-L and N-Z. This classification performance can be increased to 92.4% by decision fusion using the sum rule.
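The pipeline described above can be sketched in a few lines. The following is a minimal illustration, not the paper's actual code: it computes sorted eigenvalues of a symmetric weighted adjacency matrix as a fixed-length feature vector and trains a probabilistic-output SVM on them. The graph sizes, the zero-padding scheme, the synthetic data, and the SVM parameters are all assumptions made for the sketch.

```python
import numpy as np
from sklearn.svm import SVC

def spectral_features(adjacency, dim):
    """Sorted eigenvalues of a symmetric weighted adjacency matrix,
    zero-padded or truncated to a fixed length `dim` (assumed scheme)."""
    eigvals = np.sort(np.linalg.eigvalsh(adjacency))[::-1]
    feats = np.zeros(dim)
    n = min(dim, eigvals.size)
    feats[:n] = eigvals[:n]
    return feats

rng = np.random.default_rng(0)

def random_graph(n_nodes):
    # Random symmetric weighted adjacency matrix, standing in for a
    # letter graph from the benchmark dataset (synthetic placeholder).
    w = rng.random((n_nodes, n_nodes))
    upper = np.triu(w, 1)
    return upper + upper.T

# Two synthetic "classes" of graphs, differing only in edge-weight scale.
X = np.array([spectral_features(random_graph(8) * (1 + label), dim=10)
              for label in (0, 1) for _ in range(20)])
y = np.repeat([0, 1], 20)

# probability=True yields class-probability estimates, which is what
# decision-level fusion with the sum rule operates on: the probability
# vectors from the classifiers for each feature type are summed and
# the class with the largest total is selected.
clf = SVC(kernel="rbf", probability=True).fit(X, y)
probs = clf.predict_proba(X[:1])
print(probs.shape)
```

With several feature types, one such classifier would be trained per eigenvalue set, and the sum rule would combine their `predict_proba` outputs before taking the argmax.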