Emergent effects in multimodal feedback from virtual buttons
ACM Transactions on Computer-Human Interaction (TOCHI)
Multimodal interfaces have the potential to enhance a user's overall performance, especially when one perceptual channel, such as vision, is compromised. This research investigated how unimodal, bimodal, and trimodal feedback affected the performance of fully sighted users. Limited research exists on how fully sighted users react to multimodal feedback, and to date even less research has investigated how users with visual impairments respond to multiple forms of feedback. This study evaluated a complex direct-manipulation task consisting of a series of search-and-selection drag-and-drop subtasks. The forms of feedback investigated were auditory, haptic, and visual, each tested alone and in combination. User performance was assessed through measures of workload and time. Workload was measured both objectively, through the physiological measure of pupil diameter, and subjectively, through a portion of the NASA Task Load Index (TLX) workload survey. Time was captured as how long it took to complete a particular element of the task. The results demonstrate that multimodal feedback improves the performance of fully sighted users and offers great potential to users with visual impairments. As a result, this study serves as a baseline to drive the research and development of effective feedback combinations that enhance performance for individuals with visual impairments.
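As an aside on the subjective workload measure: the standard NASA-TLX instrument collects six subscale ratings on a 0-100 scale, and the common "raw TLX" variant scores workload as their unweighted mean (the abstract notes only a portion of the survey was used, so the exact subscales in this study are not specified here). A minimal illustrative sketch of raw-TLX scoring, with hypothetical subscale names and ratings:

```python
# Raw NASA-TLX scoring sketch (illustrative only; the study used
# only a portion of the TLX, and its exact subscales are not given here).

TLX_SCALES = ("mental", "physical", "temporal",
              "performance", "effort", "frustration")

def raw_tlx(ratings: dict) -> float:
    """Raw TLX: unweighted mean of the six 0-100 subscale ratings."""
    missing = set(TLX_SCALES) - set(ratings)
    if missing:
        raise ValueError(f"missing subscale ratings: {missing}")
    return sum(ratings[s] for s in TLX_SCALES) / len(TLX_SCALES)

# Hypothetical ratings for one participant after a drag-and-drop trial:
example = {"mental": 70, "physical": 20, "temporal": 55,
           "performance": 30, "effort": 60, "frustration": 35}
print(raw_tlx(example))  # mean of the six ratings: 45.0
```

The weighted (original) TLX instead derives per-subscale weights from 15 pairwise comparisons; raw TLX is shown here only because it is the simpler and widely used variant.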