The new Acoustic Virtual Reality (AVR) concept is often used as a man-machine interface in electronic travel aids (ETAs), which help blind and visually impaired individuals navigate real outdoor environments. Under this concept, the presence of obstacles in the surrounding environment and the path to the desired target are signaled to the blind subject by bursts of sound, whose virtual source positions indicate the positions of the real obstacles and the direction of movement, respectively. The practical implementation of the AVR concept requires the so-called Head Related Transfer Functions (HRTFs) to be known at every point of 3D space and for each individual involved. In the present paper, we describe an efficient algorithm for extracting essential Head Related Impulse Response (HRIR) data and apply it to one person's HRIRs from the Listen Ircam HRTF database. To verify its applicability, after describing the experimental setup, listening tests are conducted and their results are compared with those of listening tests performed without the proposed algorithm. Finally, conclusions and ideas for future research are presented.
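To make the rendering step behind the AVR concept concrete, the sketch below shows how a sound burst can be placed at a virtual source position by convolving it with the left- and right-ear HRIRs measured for that direction. This is a minimal illustration, not the paper's algorithm: the function name `render_binaural` and the toy impulse responses are assumptions for demonstration; in practice the HRIRs would come from measurements such as the Listen Ircam database.

```python
import numpy as np

def render_binaural(mono, hrir_left, hrir_right):
    """Spatialize a mono signal by convolving it with the HRIR pair
    measured for the desired virtual source direction."""
    left = np.convolve(mono, hrir_left)
    right = np.convolve(mono, hrir_right)
    return np.stack([left, right])  # shape: (2, len(mono) + len(hrir) - 1)

# Illustrative placeholder HRIRs: a plain interaural delay plus
# attenuation stands in for measured responses.
fs = 44100
t = np.arange(int(0.05 * fs)) / fs                 # 50 ms burst
burst = np.sin(2 * np.pi * 1000 * t)               # 1 kHz tone burst
hrir_l = np.zeros(64); hrir_l[0] = 1.0             # near ear: direct sound
hrir_r = np.zeros(64); hrir_r[30] = 0.6            # far ear: delayed, quieter

stereo = render_binaural(burst, hrir_l, hrir_r)
```

With these toy responses the right channel is a delayed, attenuated copy of the left, which the listener perceives as a source displaced toward the left ear; a full AVR system would additionally update the HRIR pair as the subject's head and the obstacle move.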