Although manual gesture has long been suggested as an intuitive method of input for horizontal human-computer systems, little research has observed user preferences for tabletop gesture interaction. This is particularly true for computer vision-based gesture input, where the recognition of different hand shapes opens up a new vocabulary of interaction. In this paper, we discuss results from an observational study of manual gesture input for a tabletop display. Implications for tabletop gesture interaction design include suggestions for the use of different hand shapes for input, the desirability of combined touch-screen and computer vision gesture input, and possibilities for flexible two-handed interaction.