Audiograf: a diagram-reader for the blind
Assets '96 Proceedings of the second annual ACM conference on Assistive technologies
IEEE Spectrum
The pantograph: a large workspace haptic device for multimodal human computer interaction
CHI '94 Conference Companion on Human Factors in Computing Systems
Haptic virtual reality for blind computer users
Assets '98 Proceedings of the third international ACM conference on Assistive technologies
Tactile imaging using watershed-based image segmentation
Assets '00 Proceedings of the fourth international ACM conference on Assistive technologies
A study of blind drawing practice: creating graphical information without the visual channel
Assets '00 Proceedings of the fourth international ACM conference on Assistive technologies
Effects of Navigation and Position on Task When Presenting Diagrams
DIAGRAMS '02 Proceedings of the Second International Conference on Diagrammatic Representation and Inference
Access to Mathematics by Blind Students - Introduction to the Special Thematic Session
ICCHP '02 Proceedings of the 8th International Conference on Computers Helping People with Special Needs
Virtual Reality Technology
Image pre-compensation to facilitate computer access for users with refractive errors
Assets '04 Proceedings of the 6th international ACM SIGACCESS conference on Computers and accessibility
ICMI '05 Proceedings of the 7th international conference on Multimodal interfaces
A wearable face recognition system for individuals with visual impairments
Proceedings of the 7th international ACM SIGACCESS conference on Computers and accessibility
Feeling what you hear: tactile feedback for navigation of audio graphs
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
A multi-domain approach for enhancing text display for users with visual aberrations
Proceedings of the 8th international ACM SIGACCESS conference on Computers and accessibility
Accommodating color blind computer users
Proceedings of the 8th international ACM SIGACCESS conference on Computers and accessibility
Sensory substitution using tactile pin arrays: human factors, technology and applications
Signal Processing - Special section: Multimodal human-computer interfaces
Haptic Rendering of Visual Data for the Visually Impaired
IEEE MultiMedia
Fuzzy-rule-based object identification methodology for NAVI system
EURASIP Journal on Applied Signal Processing
Transforming 3D coloured pixels into musical instrument notes for vision substitution applications
Journal on Image and Video Processing
Eye tracking in coloured image scenes represented by ambisonic fields of musical instrument sounds
IWINAC'05 Proceedings of the First international conference on Mechanisms, Symbols, and Models Underlying Cognition: interplay between natural and artificial computation - Volume Part I
Computer vision based travel aid for the blind crossing roads
ACIVS'06 Proceedings of the 8th international conference on Advanced Concepts For Intelligent Vision Systems
Computer vision-based terrain sensors for blind wheelchair users
ICCHP'06 Proceedings of the 10th international conference on Computers Helping People with Special Needs
See ColOr: Seeing Colours with an Orchestra
Human Machine Interaction
Integrating RFID on event-based hemispheric imaging for internet of things assistive applications
Proceedings of the 3rd International Conference on PErvasive Technologies Related to Assistive Environments
Verbally annotated tactile maps - challenges and approaches
SC'10 Proceedings of the 7th international conference on Spatial cognition
This paper reviews the state of the art in assistive devices for visually impaired people. It concentrates in particular on systems that use image and video processing to convert visual data into an alternative rendering modality appropriate for a blind user. Such alternative modalities can be auditory, haptic, or a combination of both. There is thus a need for modality conversion from the visual channel to another one, and this is where image and video processing plays a crucial role. The possible alternative sensory channels are examined with the purpose of using them to present visual information to totally blind persons. Aids that either already exist or are still under development are then presented, distinguished according to their final output channel. Haptic encoding is the most frequently used, by means of either tactile or combined tactile/kinesthetic encoding of the visual data. Auditory encoding can lead to low-cost devices, but the high information loss incurred when transforming visual data into auditory form must be handled. Despite its higher technical complexity, audio/haptic encoding has the advantage of exploiting all of the user's available sensory channels.
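To make the auditory-encoding idea concrete, the sketch below shows one common visual-to-auditory mapping surveyed in this literature: a column-scan sonification in which the image is swept left to right, each row maps to a fixed pitch (top rows high, bottom rows low), and pixel brightness sets amplitude. This is a minimal illustrative sketch, not any specific system's implementation; the function name, frequency range, and pitch mapping are assumptions made for the example.

```python
# Minimal sketch of column-scan sonification (visual -> auditory encoding).
# Illustrative only: the function name, frequency bounds, and exponential
# pitch mapping are example choices, not a particular system's design.

def sonify(image, f_low=500.0, f_high=5000.0):
    """Convert a 2D grayscale image (rows of values in [0, 1]) into a
    list of per-column frames; each frame is a list of (frequency_hz,
    amplitude) pairs, one per row. Columns play out as the time axis."""
    n_rows = len(image)
    n_cols = len(image[0])
    denom = max(n_rows - 1, 1)
    # Exponential pitch scale so equal row steps sound like equal
    # musical intervals; row 0 (top) gets the highest frequency.
    freqs = [f_low * (f_high / f_low) ** (1 - r / denom)
             for r in range(n_rows)]
    frames = []
    for c in range(n_cols):  # left-to-right scan: one frame per column
        frames.append([(freqs[r], image[r][c]) for r in range(n_rows)])
    return frames

# Example: a bright diagonal in a 3x3 image produces a descending sweep,
# since the loud pixel moves from the top row (high pitch) to the bottom
# row (low pitch) as the scan advances.
img = [[1.0, 0.0, 0.0],
       [0.0, 1.0, 0.0],
       [0.0, 0.0, 1.0]]
frames = sonify(img)
```

The information loss the review mentions is visible even in this toy: a full grayscale column is collapsed into a short spectral frame, so fine spatial detail must compete for a limited number of distinguishable pitches and loudness levels.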