Bricks: laying the foundations for graspable user interfaces
CHI '95 Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Squeeze me, hold me, tilt me! An exploration of manipulative user interfaces
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Sensing techniques for mobile interaction
UIST '00 Proceedings of the 13th annual ACM symposium on User interface software and technology
A piece of butter on the PDA display
CHI '01 Extended Abstracts on Human Factors in Computing Systems
Advanced Interaction in Context
HUC '99 Proceedings of the 1st international symposium on Handheld and Ubiquitous Computing
Phrase sets for evaluating text entry techniques
CHI '03 Extended Abstracts on Human Factors in Computing Systems
What Shall We Teach Our Pants?
ISWC '00 Proceedings of the 4th IEEE International Symposium on Wearable Computers
AppLens and LaunchTile: two designs for one-handed thumb use on small devices
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
The Mobile Sensing Platform: An Embedded Activity Recognition System
IEEE Pervasive Computing
The performance of hand postures in front- and back-of-device interaction for mobile computing
International Journal of Human-Computer Studies
Graspables: grasp-recognition as a user interface
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Expressive typing: a new way to sense typing pressure and its applications
CHI '09 Extended Abstracts on Human Factors in Computing Systems
Hand grip pattern recognition for mobile user interfaces
IAAI'06 Proceedings of the 18th conference on Innovative applications of artificial intelligence - Volume 2
FlyEye: grasp-sensitive surfaces using optical fiber
Proceedings of the fourth international conference on Tangible, embedded, and embodied interaction
Image deblurring using inertial measurement sensors
ACM SIGGRAPH 2010 papers
SqueezeBlock: using virtual springs in mobile devices for eyes-free interaction
UIST '10 Proceedings of the 23rd annual ACM symposium on User interface software and technology
Ability-Based Design: Concept, Principles and Examples
ACM Transactions on Accessible Computing (TACCESS)
Sensor synaesthesia: touch in motion, and motion in touch
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Force gestures: augmented touch screen gestures using normal and tangential force
CHI '11 Extended Abstracts on Human Factors in Computing Systems
Forcetap: extending the input vocabulary of mobile touch screens by adding tap gestures
Proceedings of the 13th International Conference on Human Computer Interaction with Mobile Devices and Services
iRotate: automatic screen rotation based on face orientation
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Touch behavior with different postures on soft smartphone keyboards
MobileHCI '12 Proceedings of the 14th international conference on Human-computer interaction with mobile devices and services
ContextType: using hand posture information to improve mobile touch screen text entry
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
iGrasp: grasp-based adaptive keyboard for mobile devices
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
iRotateGrasp: automatic screen rotation based on grasp of mobile devices
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
VibPress: estimating pressure input using vibration absorption on mobile devices
Proceedings of the 15th international conference on Human-computer interaction with mobile devices and services
Toward compound navigation tasks on mobiles via spatial manipulation
Proceedings of the 15th international conference on Human-computer interaction with mobile devices and services
Touch & activate: adding interactivity to existing objects using active acoustic sensing
Proceedings of the 26th annual ACM symposium on User interface software and technology
BackTap: robust four-point tapping on the back of an off-the-shelf smartphone
Proceedings of the adjunct publication of the 26th annual ACM symposium on User interface software and technology
Motion and context sensing techniques for pen computing
Proceedings of Graphics Interface 2013
We introduce GripSense, a system that leverages a mobile device's touchscreen, built-in inertial sensors, and vibration motor to infer hand postures, including one- or two-handed interaction, use of the thumb or index finger, or use on a table. GripSense also senses the amount of pressure a user exerts on the touchscreen despite the lack of a direct pressure sensor, by inferring it from gyroscope readings while the vibration motor is "pulsed." In a controlled study with 10 participants, GripSense differentiated device usage on a table vs. in hand with 99.67% accuracy, and when the device was in hand, it inferred hand postures with 84.26% accuracy. In addition, GripSense distinguished three levels of pressure with 95.1% accuracy. A usability analysis of GripSense in three custom applications showed that pressure input and hand-posture sensing can be useful in a number of scenarios.
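The pressure-sensing idea in the abstract — harder presses absorb more of the pulsed vibration, so less rotational energy reaches the gyroscope — can be illustrated with a minimal sketch. This is not the paper's implementation; the energy measure, the three-level mapping, and the threshold values are all illustrative assumptions.

```python
import numpy as np

def gyro_energy(samples):
    """Mean squared angular velocity over a vibration-pulse window.

    `samples` is a 1-D array of gyroscope readings captured while the
    vibration motor is pulsed. A firmer touch damps the vibration, so
    the measured energy drops.
    """
    return float(np.mean(np.square(np.asarray(samples, dtype=float))))

def classify_pressure(energy, light_thresh=0.8, hard_thresh=0.3):
    """Map damped-vibration energy to a coarse pressure level.

    Thresholds are illustrative placeholders (not values from the
    paper) and would be calibrated per device in practice.
    """
    if energy > light_thresh:
        return "light"   # little damping: vibration mostly intact
    if energy > hard_thresh:
        return "medium"
    return "hard"        # heavy damping: vibration largely absorbed
```

For example, `classify_pressure(gyro_energy([1.0, -1.0, 1.0]))` yields `"light"`, while a heavily damped window such as `[0.1, -0.1, 0.1]` yields `"hard"`.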