In this paper, we describe an opportunistic model for human-environment interaction. The model is designed to adapt the expressivity of a small lexicon of gestures through generic functional gestures, lowering the user's cognitive load and reducing system complexity. An interactive entity is modeled as a finite-state machine, and a functional gesture is defined by the semantic meaning of the event that triggers a state transition, not by the physical movement to be performed. An interaction scenario has been designed to evaluate the features of the proposed model and to investigate how its application can enhance post-WIMP human-environment interaction.
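The core idea can be illustrated with a minimal sketch. Here an interactive entity (a hypothetical lamp, chosen only for illustration) is a finite-state machine, and a functional gesture such as "activate" or "cycle" names the semantic event that triggers a transition, independently of how the user physically performs it. All names (`LampState`, `InteractiveEntity`, the gesture labels) are assumptions, not taken from the paper:

```python
from enum import Enum, auto

class LampState(Enum):
    OFF = auto()
    ON = auto()
    DIMMED = auto()

# Transition table: (current state, functional gesture) -> next state.
# The gesture is identified by its semantic meaning ("activate",
# "cycle"), not by the hand movement used to perform it, so the same
# small gesture lexicon can be reused across many entities.
TRANSITIONS = {
    (LampState.OFF, "activate"): LampState.ON,
    (LampState.ON, "activate"): LampState.OFF,
    (LampState.ON, "cycle"): LampState.DIMMED,
    (LampState.DIMMED, "cycle"): LampState.ON,
    (LampState.DIMMED, "activate"): LampState.OFF,
}

class InteractiveEntity:
    def __init__(self, state: LampState = LampState.OFF) -> None:
        self.state = state

    def on_gesture(self, gesture: str) -> LampState:
        # A gesture with no transition defined in the current state
        # is simply ignored, leaving the entity's state unchanged.
        self.state = TRANSITIONS.get((self.state, gesture), self.state)
        return self.state

lamp = InteractiveEntity()
lamp.on_gesture("activate")  # OFF -> ON
lamp.on_gesture("cycle")     # ON -> DIMMED
```

Keeping the transition table separate from the entity means each device only declares which semantic events it responds to, which is one plausible way the model could keep the gesture vocabulary small while the environment grows.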