Auditory interfaces can outperform visual interfaces when a primary task, such as driving, competes for the attention of a user controlling a device such as a radio. In emerging interfaces enabled by camera tracking, auditory displays may also provide viable alternatives to visual displays. This paper presents a user study of interoperable auditory and visual menus, in which the control gestures remain the same in both the visual and the auditory domain. The tested control methods included a novel free-hand gesture interaction with camera-based tracking and touch-screen interaction with a tablet. The participants' task was to select numbers from a visual or an auditory menu presented in either a circular layout or a numeric-keypad layout. Results show that even with the participants' full attention on the task, the performance and accuracy of the auditory interface were the same as or even slightly better than those of the visual interface when controlled with free-hand gestures. The auditory menu was measured to be slower in touch-screen interaction, but a questionnaire revealed that over half of the participants felt that the circular auditory menu was faster than the visual menu. Furthermore, visual and auditory feedback in touch-screen interaction with the numeric layout was measured to be fastest, the touch screen with the circular menu second fastest, and the free-hand gesture interface slowest. The results suggest that auditory menus can provide a fast and desirable interface for controlling devices with free-hand gestures.
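To make the circular-menu condition concrete, the sketch below shows one common way an eyes-free circular menu can map a touch or tracked-hand position to a menu item: the angle around the menu centre selects one of N equal slices, with a central dead zone producing no selection. This is an illustrative assumption, not the paper's published algorithm; the function name, parameters, and dead-zone behaviour are hypothetical.

```python
import math

def circular_menu_item(x, y, cx, cy, n_items=10, dead_zone=0.1):
    """Map a touch/hand position to an item index in a circular menu.

    Hypothetical sketch: the angle of (x, y) around the menu centre
    (cx, cy), measured clockwise from 12 o'clock (screen coordinates,
    y grows downward), picks one of n_items equal slices. Positions
    within dead_zone of the centre return None (no selection), which
    lets the user rest without triggering an item.
    """
    dx, dy = x - cx, y - cy
    if math.hypot(dx, dy) < dead_zone:
        return None  # too close to the centre: no selection
    # Clockwise angle from the top, normalised to [0, 360).
    angle = math.degrees(math.atan2(dx, -dy)) % 360.0
    slice_width = 360.0 / n_items
    return int(angle // slice_width)
```

With ten items (digits 0-9), a point straight above the centre selects item 0, and moving clockwise steps through the remaining slices; the same mapping works whether the position comes from a touch screen or from camera-based hand tracking, which is what makes the gesture set interoperable across the two control methods.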