This paper presents research on how a finger-gesture design space for interacting with hand-held tablets may be defined. The parameters that limit or extend this space, such as anatomy-dependent gesture feasibility, grasp requirements, gesture occlusion, and gesture complexity, are discussed based on initial exploratory expert interviews and subsequent user studies. The goal of this research is to define the parameters that must be taken into account when developing a finger-gesture UI model for hand-held tablets. Although this model follows a strongly user-centric design approach rather than being technology-driven, technical solutions for detecting finger gestures are also considered. A model design is presented, and research questions for investigating this model in greater detail are outlined.