Orientation sensing is considered an important means of implementing embedded, technology-enhanced artifacts (often referred to as 'smart artifacts') that exhibit embodied means of interaction based on their position, orientation, and the respective dynamics. Considering artifacts subject to manual (or 'by-hand') manipulation by the user, we identify hand-worn, hand-carried, and (hand-)graspable real-world objects as exhibiting different orientation dynamics, justifying an analysis along these three categories. We refer to orientation dynamics as 'gestures' in an abstract sense and present a general framework for orientation-sensor-based gesture recognition. The framework specification is independent of sensor technology and classification method, and elaborates an application-independent set of gestures. It enables multi-sensor interoperability and accommodates a variable number of sensors. A core component of the framework is a gesture library containing gestures from three categories: hand gestures, gestures of artifacts held permanently, and gestures of artifacts that are detached from the hand and manipulated only occasionally. An inertial orientation-sensing-based gesture detection and recognition system is developed and composed into a gesture-based interaction development framework. The use of this framework is demonstrated with the development of tangible remote controls for a media player, both in hardware and in software.
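The gesture library and its three categories could be sketched as follows. This is a minimal illustration, not the paper's implementation: the class and field names (`GestureLibrary`, `OrientationSample`, the nearest-template `classify` method) are hypothetical, and orientation is assumed to arrive as roll/pitch/yaw samples, in keeping with the framework's independence from any particular sensor technology or classification method.

```python
from dataclasses import dataclass
from enum import Enum
from typing import List, Sequence


class GestureCategory(Enum):
    # The three artifact categories identified in the abstract.
    HAND_WORN = "hand gesture"          # sensor worn on the hand itself
    HELD = "held artifact"              # artifact held permanently while in use
    GRASPABLE = "graspable artifact"    # detached artifact, manipulated occasionally


@dataclass
class OrientationSample:
    # Hypothetical sensor-agnostic orientation reading (degrees).
    roll: float
    pitch: float
    yaw: float


@dataclass
class Gesture:
    name: str
    category: GestureCategory
    template: List[OrientationSample]   # reference orientation trajectory


class GestureLibrary:
    """Application-independent store of gestures from the three categories."""

    def __init__(self) -> None:
        self._gestures: List[Gesture] = []

    def add(self, gesture: Gesture) -> None:
        self._gestures.append(gesture)

    def by_category(self, category: GestureCategory) -> List[Gesture]:
        return [g for g in self._gestures if g.category == category]

    def classify(self, trajectory: Sequence[OrientationSample]) -> str:
        # Nearest-template matching is used here only as a placeholder;
        # the framework itself is independent of the classification method.
        def dist(a: OrientationSample, b: OrientationSample) -> float:
            return ((a.roll - b.roll) ** 2
                    + (a.pitch - b.pitch) ** 2
                    + (a.yaw - b.yaw) ** 2) ** 0.5

        best = min(self._gestures,
                   key=lambda g: sum(dist(s, t)
                                     for s, t in zip(trajectory, g.template)))
        return best.name
```

A tangible media-player remote control of the kind described above would then map recognized gesture names (e.g. a tilt of the held artifact) to player commands such as play, pause, or volume change.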