The availability of inexpensive robotic hardware has made the dream of having autonomous mobile robots around us a reality. Accordingly, the research community has recently shown growing interest in assistive robotic technology (see the proceedings of the last two IEEE RO-MAN conferences, the emergence of the RoboCup@Home challenge at RoboCup, and the first annual Human-Robot Interaction Conference, jointly sponsored by IEEE and ACM). For blind users, robots can restore what was lost when textual interfaces were replaced by GUIs. This paper describes the design, implementation, and testing of a first prototype of a multi-modal human-robot interface for people with vision impairment. The robot used is the commercially available four-legged SONY Aibo.