Robotic wheelchair based on observations of people using integrated sensors

  • Authors:
  • Yoshinori Kobayashi;Yuki Kinpara;Tomoo Shibusawa;Yoshinori Kuno

  • Affiliations:
Department of Information and Computer Sciences, Saitama University, Saitama city, Saitama, Japan (all authors)

  • Venue:
IROS'09: Proceedings of the 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems
  • Year:
  • 2009

Abstract

Recently, several robotic/intelligent wheelchairs with user-friendly interfaces or autonomous functions have been proposed. Although it is often desirable for users to operate wheelchairs on their own, they are frequently accompanied by a caregiver or companion, so reducing the caregiver's load is also an important design goal. In this paper we propose a robotic wheelchair that can move alongside a caregiver, side by side. In contrast to a front-behind arrangement, a side-by-side arrangement makes it more difficult for the wheelchair to adjust its motion when the caregiver turns. To cope with this problem we present a visual-laser tracking technique, in which a laser range sensor and an omni-directional camera are integrated to observe the caregiver. A Rao-Blackwellized particle filter framework is employed to track the caregiver's position and the orientation of both body and head, based on the distance data from the laser range sensor and the panoramic images captured by the omni-directional camera. After presenting this technique, we introduce an application of the wheelchair to museum visits.
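The sketch below illustrates the general idea of fusing the two sensing modalities mentioned in the abstract: laser-based position cues and camera-based orientation cues combined in a particle filter. It is a minimal, plain particle filter, not the Rao-Blackwellized formulation used by the authors, and all function names, noise parameters, and sensor models here are illustrative assumptions rather than details from the paper.

```python
import numpy as np

# Minimal sketch: a plain particle filter that fuses a laser-detected person
# position and an image-based heading estimate to track (x, y, body_heading).
# Noise scales and likelihood models are assumed for illustration only.

N_PARTICLES = 500
rng = np.random.default_rng(0)

def predict(particles):
    """Propagate particles with a simple random-walk motion model."""
    noise = rng.normal(scale=[0.05, 0.05, 0.1], size=particles.shape)
    return particles + noise

def laser_likelihood(particles, laser_xy, sigma=0.2):
    """Weight particles by distance to the laser-detected person position."""
    d = np.linalg.norm(particles[:, :2] - laser_xy, axis=1)
    return np.exp(-0.5 * (d / sigma) ** 2)

def camera_likelihood(particles, observed_heading, sigma=0.3):
    """Weight particles by agreement with the image-based heading (radians)."""
    err = np.angle(np.exp(1j * (particles[:, 2] - observed_heading)))
    return np.exp(-0.5 * (err / sigma) ** 2)

def step(particles, laser_xy, observed_heading):
    particles = predict(particles)
    w = laser_likelihood(particles, laser_xy) * camera_likelihood(particles, observed_heading)
    w /= w.sum()
    # Weighted estimate; heading is averaged on the unit circle.
    xy = (particles[:, :2] * w[:, None]).sum(axis=0)
    heading = np.arctan2((np.sin(particles[:, 2]) * w).sum(),
                         (np.cos(particles[:, 2]) * w).sum())
    estimate = np.array([xy[0], xy[1], heading])
    # Systematic-style resampling via weighted draw.
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx], estimate

# Usage with synthetic measurements (purely illustrative):
particles = rng.normal(loc=[0.0, 0.5, 0.0], scale=0.1, size=(N_PARTICLES, 3))
for t in range(10):
    laser_xy = np.array([0.1 * t, 0.5])   # pretend laser cluster position
    heading = 0.0                          # pretend image-based heading
    particles, est = step(particles, laser_xy, heading)
print("estimated (x, y, heading):", est)
```

In the paper's Rao-Blackwellized variant, part of the state would be handled analytically rather than sampled, which reduces the number of particles needed; the sketch above only conveys how laser and camera evidence can be combined in a single tracking filter.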