Mobile devices are one form of input for interacting with large shared surfaces. These personal devices offer interactive displays as well as numerous sensors that can be used to perform gestures. We examine the possibility of using surface and motion gestures on mobile devices to interact with 3D objects on large displays. If such devices can be used effectively with large displays, users can collaborate on complex 3D manipulation tasks, which are otherwise non-trivial to perform. To derive design guidelines for this type of interaction, we conducted a guessability study with a dual-surface concept device, which gives users access to input through both its front and back. We elicited a set of end-user surface- and motion-based gestures. Our results show reasonably good agreement among participants' gestures across sensor-based (i.e., tilt), multi-touch, and dual-surface input. In this paper we report the results of the guessability study and the design of the resulting gesture-based interface for 3D manipulation.
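The "agreement" reported here is conventionally computed with the agreement score from Wobbrock et al.'s guessability methodology: for each referent (task), participants' proposed gestures are grouped by identity, and the score is the sum of the squared proportions of each group. A minimal sketch (the function name and example gesture labels are illustrative, not from the paper):

```python
from collections import Counter

def agreement_score(proposals):
    """Agreement score for one referent, per Wobbrock et al.'s
    guessability methodology: sum over groups of identical proposed
    gestures of (group size / total proposals) squared.
    Ranges from 1/n (all n proposals distinct) to 1.0 (unanimous)."""
    total = len(proposals)
    if total == 0:
        return 0.0
    counts = Counter(proposals)
    return sum((c / total) ** 2 for c in counts.values())

# Hypothetical example: 10 participants propose a gesture for
# "rotate object"; 6 choose tilt, 3 a two-finger twist, 1 a shake.
score = agreement_score(["tilt"] * 6 + ["twist"] * 3 + ["shake"])
print(round(score, 2))  # → 0.46
```

Higher scores indicate that participants independently converged on the same gesture for a task, which is the basis for claims of "reasonably good agreement" in elicitation studies like this one.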