Multi-display environments (MDEs) have advanced rapidly in recent years, incorporating multi-touch tabletops, tablets, wall displays, and even position-tracking systems. Designers have proposed a variety of interesting gestures for use in MDEs, some of which involve a user moving their hands, arms, body, or even a device itself. These gestures often serve to move data between the components of an MDE, a long-standing research problem. However, most of these gestures were created by designers rather than users, and implementation concerns such as recognition accuracy may have influenced their design. We performed a user study to elicit these gestures directly from users, but found a low level of convergence among the gestures produced. This lack of agreement is important, and we discuss its possible causes and its implications for designers. To assist designers, we present the most prevalent gestures and some of the conceptual themes underlying them. We also analyze how factors such as distance and device type affect gesture choice, and discuss how to apply these findings to real-world systems.
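Convergence in gesture-elicitation studies is commonly quantified with an agreement score: for each referent (task), proposals are grouped by identical gesture, and the squared proportions of the group sizes are summed. Whether this paper uses that exact metric is an assumption here; the gesture labels below are hypothetical. A minimal sketch of the computation:

```python
from collections import Counter

def agreement_score(proposals):
    """Agreement for one referent: sum over groups of identical
    gestures of (group size / total proposals) squared.
    Ranges from 1/n (no two participants agree) to 1.0 (all agree)."""
    n = len(proposals)
    return sum((count / n) ** 2 for count in Counter(proposals).values())

# Hypothetical data: gestures proposed by six participants for one
# referent, e.g. "move photo from tablet to wall display".
proposals = ["flick", "flick", "drag", "pinch", "flick", "drag"]
print(round(agreement_score(proposals), 3))  # (3^2 + 2^2 + 1^2) / 6^2 ≈ 0.389
```

A score near the lower bound across many referents would correspond to the "lack of agreement" the abstract reports, whereas designer-led gesture sets implicitly assume scores closer to 1.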