Previous perception and control systems for smart wheelchairs typically do not distinguish between different objects and treat everything as an obstacle. This makes it difficult to realize object-related navigation tasks, such as furniture docking or door passage, without interference from the obstacle-avoidance behavior. In this article, a local 3D semantic map is built online using a low-cost RGB-D camera. The map provides the semantic and geometric data of recognized objects to the shared-control modules for user-intention estimation, target selection, motion control, and the adjustment of the weight-optimization parameters for different targets. With the object information provided by the 3D semantic map, the control system can choose different behaviors according to user intention to carry out object-related navigation. A smart wheelchair prototype equipped with a Kinect was developed and tested in a real environment. The experiments showed that the 3D semantic map-based shared control effectively enhances the smart wheelchair's mobility and improves collaboration between the user and the wheelchair.
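The core idea of the shared control described above is that semantic object labels, rather than a flat obstacle grid, drive both target selection and behavior switching. The following is a minimal sketch of that pipeline; the class names, the alignment-based intention heuristic, and the behavior labels are illustrative assumptions, not the paper's actual implementation.

```python
from dataclasses import dataclass
from math import atan2, cos

@dataclass
class SemanticObject:
    """One recognized object from the local 3D semantic map."""
    label: str   # e.g. "door", "table", "obstacle"
    x: float     # position in the wheelchair frame (metres)
    y: float

def estimate_target(objects, joystick_heading, max_angle=0.5):
    """Estimate user intention: pick the recognized object best aligned
    with the joystick direction (within max_angle radians).
    Returns None when no object matches, i.e. plain driving."""
    best, best_score = None, 0.0
    for obj in objects:
        bearing = atan2(obj.y, obj.x)
        alignment = cos(bearing - joystick_heading)
        if alignment > best_score and abs(bearing - joystick_heading) < max_angle:
            best, best_score = obj, alignment
    return best

def select_behavior(target):
    """Map the selected semantic target to a navigation behavior,
    falling back to obstacle avoidance when no target is chosen."""
    if target is None:
        return "obstacle_avoidance"
    if target.label == "door":
        return "door_passage"
    if target.label == "table":
        return "docking"
    return "obstacle_avoidance"
```

For example, with a door 2 m ahead and a table off to the side, a joystick pushed straight forward would select the door and trigger the door-passage behavior instead of treating the door frame as an obstacle to avoid.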