The use of eye movements in human-computer interaction techniques: what you look at is what you get. ACM Transactions on Information Systems (TOIS), special issue on computer-human interaction.
Charade: remote control of objects using free-hand gestures. Communications of the ACM, special issue on computer augmented environments: back to the real world.
An HMM-based threshold model approach for gesture recognition. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Light widgets: interacting in every-day spaces. Proceedings of the 7th International Conference on Intelligent User Interfaces.
Pre-emptive shadows: eliminating the blinding light from projectors. CHI '02 Extended Abstracts on Human Factors in Computing Systems.
Recognizing temporal trajectories using the condensation algorithm. FG '98: Proceedings of the 3rd International Conference on Face & Gesture Recognition.
On creating animated presentations. Proceedings of the 2003 ACM SIGGRAPH/Eurographics Symposium on Computer Animation.
Bare-hand human-computer interaction. Proceedings of the 2001 Workshop on Perceptive User Interfaces.
A portable system for anywhere interactions. CHI '04 Extended Abstracts on Human Factors in Computing Systems.
Recognition-based gesture spotting in video games. Pattern Recognition Letters.
Evaluation of alternative presentation control techniques. CHI '05 Extended Abstracts on Human Factors in Computing Systems.
Shadow reaching: a new perspective on interaction for large displays. Proceedings of the 20th Annual ACM Symposium on User Interface Software and Technology.
Observing presenters' use of visual aids to inform the design of classroom presentation software. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems.
HyperSlides: dynamic presentation prototyping. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems.
A gestural recognition interface for intelligent wheelchair users. International Journal of Sociotechnology and Knowledge Development.
Driven by the increasing availability of low-cost sensing hardware, gesture-based input is quickly becoming a viable form of interaction for a variety of applications. Electronic presentation software (e.g., PowerPoint, Keynote) has long been seen as a natural fit for this form of interaction. However, despite 20 years of prototyping such systems, little is known about how gesture-based input affects presentation dynamics, or how it can best be applied in this context; past work has focused almost exclusively on recognition algorithms. This paper explicitly addresses these gaps in the literature. Through observations of real-world practices, we first describe the types of gestures presenters naturally make and the purposes these gestures serve when presenting content. We then introduce Maestro, a gesture-based presentation system explicitly designed to support and enhance these existing practices. Finally, we describe the results of a real-world field study in which Maestro was evaluated in a classroom setting for several weeks. Our results indicate that gestures that enable direct interaction with slide content are the most natural fit for this input modality. In contrast, we found that using gestures to navigate slides (the most common implementation in all prior systems) has significant drawbacks. Our results also show how gesture-based input can noticeably alter presentation dynamics, often in undesirable ways.