The way we grasp an object depends on several factors, e.g. the intended goal or the hand's anatomy. Therefore, a grasp can convey meaningful information about its context. Inferring these factors from a grasp allows us to enhance interaction with grasp-sensitive objects. This paper highlights the grasp as an important source of meaningful context for human-computer interaction and gives an overview of prior work from other disciplines. It offers a basis and framework for further research and discussion by proposing a descriptive model of meaning in grasps. The GRASP model combines five factors that determine how an object is grasped: goal, relationship between user and object, anatomy of the hand, setting, and properties of the object. The model is validated both from an epistemological perspective and by applying it to scenarios from related work.
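To make the five GRASP factors concrete, the following is a minimal sketch of how one grasp observation could be represented in code. The class name `GraspContext`, the field names, and the phone-call example are illustrative assumptions, not part of the original model's specification.

```python
from dataclasses import dataclass


@dataclass
class GraspContext:
    """One grasp observation described by the five GRASP factors (illustrative sketch)."""
    goal: str          # what the user intends to do with the object
    relationship: str  # relationship between user and object
    anatomy: str       # anatomy of the grasping hand
    setting: str       # situation in which the grasp occurs
    properties: str    # properties of the grasped object

    def factors(self) -> dict:
        """Return the five factors as a dict, e.g. as input to a classifier."""
        return {
            "goal": self.goal,
            "relationship": self.relationship,
            "anatomy": self.anatomy,
            "setting": self.setting,
            "properties": self.properties,
        }


# Hypothetical scenario: answering a call on a grasp-sensitive phone.
call = GraspContext(
    goal="answer call",
    relationship="owner of the device",
    anatomy="adult right hand",
    setting="walking outdoors",
    properties="rigid, slab-shaped phone",
)
```

A grasp-sensitive system would estimate some of these factors from sensor data and use the others as known context; the dataclass simply makes explicit which five dimensions the model says are in play.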