We present Bezel-Tap Gestures, a novel family of interaction techniques for immediate interaction with handheld tablets, whether the device is awake or in sleep mode. The technique rests on the close succession of two input events: first a bezel tap, whose detection by the accelerometer wakes an idle tablet almost instantly, then a screen contact. Field studies confirmed that this input sequence is very unlikely to occur by chance, ruling out accidental activation. One experiment examined the optimal size of the command vocabulary for each of the four bezel regions (top, bottom, left, right). Another experiment evaluated two variants of the technique, both allowing two-level selection in a command hierarchy, with the initial bezel tap followed by either two screen taps or a screen slide. The data suggest that Bezel-Tap Gestures can support large vocabularies of micro-interactions with a sleeping tablet.
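The core detection logic the abstract describes — a bezel tap sensed as an accelerometer spike, followed by a screen contact within a short window — can be sketched as a small state machine. This is an illustrative reconstruction, not the authors' implementation: the class, the 2 g spike threshold, and the 500 ms pairing window are all assumptions for the sake of the example.

```python
# Hypothetical sketch of the Bezel-Tap sequence detector: a bezel tap
# (an accelerometer spike attributed to one bezel region) must be
# followed by a screen contact within a short time window. The threshold
# and window below are illustrative assumptions, not values from the paper.

TAP_THRESHOLD_G = 2.0   # accel magnitude treated as a bezel tap (assumed)
MAX_DELAY_S = 0.5       # screen contact must follow within this window (assumed)

class BezelTapDetector:
    def __init__(self):
        self._last_tap_time = None
        self._last_tap_region = None

    def on_accel_spike(self, magnitude_g, region, timestamp):
        """Record a candidate bezel tap; region is 'top'/'bottom'/'left'/'right'."""
        if magnitude_g >= TAP_THRESHOLD_G:
            self._last_tap_time = timestamp
            self._last_tap_region = region

    def on_screen_contact(self, timestamp):
        """Return the bezel region if this contact completes a Bezel-Tap
        Gesture, else None (an ordinary touch)."""
        if (self._last_tap_time is not None
                and timestamp - self._last_tap_time <= MAX_DELAY_S):
            region = self._last_tap_region
            self._last_tap_time = None  # consume the pending tap
            return region
        return None

detector = BezelTapDetector()
detector.on_accel_spike(3.1, "top", timestamp=10.00)
print(detector.on_screen_contact(timestamp=10.20))  # within window -> top
print(detector.on_screen_contact(timestamp=10.25))  # tap already consumed -> None
```

Because the accelerometer can run while the display is off, the pending-tap state can be set even on a sleeping device, which is what makes the two-event sequence usable as a wake-up-and-act gesture; the low chance probability of the paired events is what guards against accidental activation.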