As mobile and tangible devices become smaller, it is desirable to extend the interaction area to their entire surface. The HandSense prototype employs capacitive sensors to detect when it is touched or held against a body part. HandSense can also detect in which hand the device is held, and how. A user study confirmed the general properties of our approach: HandSense correctly classified over 80 percent of all touches while discriminating six different ways of touching the device (hold left/right, pick up left/right, pick up at top/bottom). This information can be used to implement or enhance implicit and explicit interaction with mobile phones and other tangible user interfaces. For example, graphical user interfaces can be adapted to the user's handedness.
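The classification step described above can be illustrated with a minimal sketch. The sensor layout, feature values, and classifier below are all hypothetical (the abstract does not specify HandSense's actual method); the sketch simply shows how a vector of capacitive readings from pads on the device casing could be mapped to one of the six grasp classes, here via nearest-neighbour matching against labelled prototypes.

```python
import math

def euclidean(a, b):
    """Euclidean distance between two sensor-reading vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify_grasp(sample, prototypes):
    """Return the grasp label of the nearest labelled prototype.

    Illustrative nearest-neighbour classifier, not HandSense's
    actual algorithm.
    """
    return min(prototypes, key=lambda p: euclidean(sample, p[0]))[1]

# Toy prototypes: four hypothetical capacitive pads
# (left edge, right edge, top edge, bottom edge), normalised 0..1.
prototypes = [
    ((0.9, 0.1, 0.2, 0.2), "hold-left"),
    ((0.1, 0.9, 0.2, 0.2), "hold-right"),
    ((0.2, 0.2, 0.9, 0.1), "pick-up-top"),
    ((0.2, 0.2, 0.1, 0.9), "pick-up-bottom"),
]

# A reading dominated by the left-edge pad is classified as a left-hand hold.
print(classify_grasp((0.85, 0.15, 0.25, 0.20), prototypes))  # → hold-left
```

In practice, a grasp classifier of this kind would be trained on many labelled samples per user and grasp type, which is consistent with the per-class accuracy figures reported in the study.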