Mobile devices now offer new possibilities for gesture interaction thanks to their wide range of embedded sensors and their physical form factor. In addition, auditory interfaces can be more easily supported through advanced mobile computing capabilities. Although various gesture techniques have been proposed for handheld devices, little is known about the acceptability and use of these techniques, especially in the context of an auditory interface. In this paper, we take a novel approach to the problem by studying the design space of gestures proposed by end-users for a mobile auditory interface. We discuss the results of this exploratory study in terms of the scope of the proposed gestures, their tangible aspects, and the users' preferences. The study delivers initial gesture recommendations for eyes-free auditory interfaces.