The Gesture and Activity Recognition Toolkit (GART) is a user interface toolkit designed to enable the development of gesture-based applications. GART provides an abstraction over the machine learning algorithms used to model and recognize different types of gestures, along with support for data collection and training. In this paper, we present GART and its machine learning abstractions, detail the components of the toolkit, and present two example gesture recognition applications.
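To make the collect/train/recognize pipeline concrete, the sketch below shows a toy gesture recognizer. This is not GART's actual API (GART is built around machine learning models such as HMMs); it is a minimal nearest-neighbour stand-in, with all class and function names hypothetical, that mirrors the same three steps: reduce raw samples to features, store labelled training examples, and classify new input against them.

```python
# Hypothetical sketch only -- illustrates the collect/train/recognize
# workflow the abstract describes; it is NOT GART's real interface.
import math

def features(samples):
    """Reduce a gesture (a list of (x, y) points) to a simple feature
    vector: total path length plus net displacement in x and y."""
    length = sum(math.dist(a, b) for a, b in zip(samples, samples[1:]))
    dx = samples[-1][0] - samples[0][0]
    dy = samples[-1][1] - samples[0][1]
    return (length, dx, dy)

class GestureRecognizer:
    def __init__(self):
        self.templates = []  # list of (feature vector, label) pairs

    def train(self, samples, label):
        """Data collection / training step: store a labelled example."""
        self.templates.append((features(samples), label))

    def recognize(self, samples):
        """Return the label of the stored template whose feature
        vector is nearest to the input gesture's."""
        f = features(samples)
        return min(self.templates, key=lambda t: math.dist(t[0], f))[1]
```

A real toolkit in this space would typically swap the nearest-neighbour step for a trained statistical model, but the division of labour (feature extraction, labelled data collection, recognition) is the part the abstraction hides from the application developer.

```python
rec = GestureRecognizer()
rec.train([(0, 0), (1, 0), (2, 0)], "swipe-right")
rec.train([(0, 0), (0, 1), (0, 2)], "swipe-up")
rec.recognize([(0, 0), (2.1, 0)])  # nearest template: "swipe-right"
```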