Lg: a computational framework for research in sketch-based interfaces
SG'11 Proceedings of the 11th international conference on Smart graphics
In this paper, we investigate how discourse context in the form of short-term memory can be exploited to automatically group consecutive strokes in digital freehand sketching. With this machine learning approach, no database of explicit object representations is used for template matching on a complete scene; instead, grouping decisions are based on limited spatio-temporal context. We employ two classifier formalisms for this time-series analysis task: Echo State Networks (ESNs) and Support Vector Machines (SVMs). ESNs are internal-state classifiers with inherent memory capabilities; for the conventional static SVM, short-term memory is supplied externally via fixed-length feature-vector expansion. We compare the respective setup heuristics and conduct experiments on two exemplary problems. Both formalisms achieve promising results, but our experiments indicate that using ESNs for variable-length memory tasks reduces the risk of overfitting caused by non-expressive features or an improperly chosen temporal embedding dimension.