Real world haptic exploration for telepresence of the visually impaired

  • Authors:
  • Chung Hyuk Park; Ayanna M. Howard

  • Affiliations:
  • Georgia Institute of Technology, Atlanta, GA, USA (both authors)

  • Venue:
  • HRI '12: Proceedings of the Seventh Annual ACM/IEEE International Conference on Human-Robot Interaction
  • Year:
  • 2012

Abstract

Robotic assistance through telepresence technology is an emerging approach to aiding the visually impaired. By integrating robotic perception of a remote environment and transferring it to a human user through haptic environmental feedback, a visually impaired user can better interact with remote environments through the telepresence robot. This paper presents a framework that integrates visual perception from heterogeneous vision sensors and enables real-time, interactive haptic representation of the real world through a mobile manipulation robotic system. Specifically, a set of multidisciplinary algorithms, including stereo-vision processing, three-dimensional map building, and virtual-proxy haptic rendering, is integrated into a unified framework to accomplish real-world haptic exploration. Results of our framework in an indoor environment are presented, and its performance is analyzed. Quantitative results are provided along with qualitative results from human subject testing. Future work includes real-time haptic fusion of multi-modal environmental perception and more extensive human subject testing in a prolonged experimental design.
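The virtual-proxy haptic rendering mentioned in the abstract can be illustrated with a minimal sketch: the proxy (god object) tracks the haptic device position in free space but is constrained to the mapped surface on contact, and the rendered force is a spring pulling the device back toward the proxy. This is a generic single-plane illustration of the standard virtual-proxy idea, not the paper's actual implementation; the function name and stiffness value are assumptions.

```python
import math

def virtual_proxy_force(hip, plane_point, plane_normal, stiffness=500.0):
    """Sketch of virtual-proxy rendering against one planar surface patch.

    hip          -- haptic interface point (device position), 3-tuple
    plane_point  -- any point on the surface patch (e.g. from a 3-D map)
    plane_normal -- outward surface normal
    Returns (proxy, force): the proxy stays on the surface when the HIP
    penetrates it; the force is a Hooke's-law spring from HIP to proxy.
    """
    # Normalize the surface normal.
    norm = math.sqrt(sum(c * c for c in plane_normal))
    n = tuple(c / norm for c in plane_normal)
    # Signed distance of the HIP from the plane (negative = penetrating).
    depth = sum((h - p) * c for h, p, c in zip(hip, plane_point, n))
    if depth >= 0.0:
        # Free space: proxy coincides with the device, no force rendered.
        return tuple(hip), (0.0, 0.0, 0.0)
    # Contact: project the HIP back onto the surface to place the proxy.
    proxy = tuple(h - depth * c for h, c in zip(hip, n))
    # Spring force pulls the device out of the surface toward the proxy.
    force = tuple(stiffness * (px - h) for px, h in zip(proxy, hip))
    return proxy, force
```

In a full system, the planar patch would come from the stereo-vision 3-D map, and this force computation would run inside the high-rate haptic servo loop.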