We investigate techniques for combining multiple representations of a handwritten digit to increase classification accuracy without significantly increasing system complexity or recognition time. We compare multiexpert and multistage combination schemes, and discuss comparatively several methods for combining multiple learners: voting, mixture of experts, stacking, boosting, and cascading. In pen-based handwritten character recognition, the input is the dynamic movement of the pen tip over a pressure-sensitive tablet; there is also the image formed as a result of this movement. On a real-world database, we observe that two multilayer perceptron (MLP) neural network classifiers, each trained on one of these representations, make errors on different patterns, implying that a suitable combination of the two would yield higher accuracy. We therefore implement and compare voting, mixture of experts, stacking, and cascading. The combined classifiers achieve lower error rates than the individual ones, and the final combined system of two MLPs has lower complexity and memory requirements than a single k-nearest-neighbor classifier using one of the representations.
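Two of the combination schemes mentioned above can be sketched briefly. This is a minimal illustration, not the paper's implementation: soft voting averages the two classifiers' class-posterior estimates, while cascading accepts the first classifier's decision only when it is confident enough and otherwise defers to the second. The toy posterior vectors and the 0.9 confidence threshold are illustrative assumptions.

```python
def soft_vote(p1, p2):
    """Average the class posteriors of two classifiers and return the argmax class."""
    avg = [(a + b) / 2.0 for a, b in zip(p1, p2)]
    return max(range(len(avg)), key=avg.__getitem__)

def cascade(p1, classifier2, x, threshold=0.9):
    """Accept classifier 1's decision if confident; otherwise consult classifier 2."""
    best = max(range(len(p1)), key=p1.__getitem__)
    if p1[best] >= threshold:
        return best
    # First classifier is unsure: fall back to the (costlier) second one.
    return classifier2(x)

# Toy posteriors over the 10 digit classes for one input pattern.
p_dyn = [0.05] * 10; p_dyn[3] = 0.55  # classifier on the dynamic (pen-movement) representation
p_img = [0.02] * 10; p_img[3] = 0.82  # classifier on the image representation

label = soft_vote(p_dyn, p_img)   # both favor class 3, so the vote does too
```

In a cascade, the cheap dynamic-representation classifier would handle the confident cases, and only ambiguous patterns (like `p_dyn` here, whose maximum 0.55 is below the threshold) would be passed on to the second classifier.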