Nao robot localization and navigation using fusion of odometry and visual sensor data

  • Authors:
  • Šimon Fojtů; Michal Havlena; Tomáš Pajdla

  • Affiliations:
  • Center for Machine Perception, Department of Cybernetics, FEE, CTU in Prague, Prague 2, Czech Republic (all authors)

  • Venue:
  • ICIRA'12 Proceedings of the 5th international conference on Intelligent Robotics and Applications - Volume Part II
  • Year:
  • 2012


Abstract

The Nao humanoid robot from Aldebaran Robotics is equipped with an odometry sensor that provides rather inaccurate robot pose estimates. We propose using Structure from Motion (SfM) to enable visual odometry from the Nao camera without adding artificial markers to the scene, and we show that the robot pose estimates can be significantly improved by fusing the data from the odometry sensor with the visual odometry. The implementation consists of sensor modules streaming robot data, a mapping module creating a 3D model, a visual localization module estimating the camera pose w.r.t. the model, and a navigation module planning robot trajectories and performing the actual movement. All of the modules are connected through the RSB middleware, which makes the solution independent of the given robot type.
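The abstract states that pose estimates improve when odometry and visual odometry are fused, but does not specify the fusion scheme. A minimal sketch of one possible approach, a fixed convex weighting of two planar (x, y, theta) pose estimates, is shown below; the function name and weights are hypothetical, not from the paper:

```python
import math

def fuse_pose(odo, vis, w_odo=0.2, w_vis=0.8):
    """Fuse two (x, y, theta) pose estimates by a fixed convex weighting.

    `odo` is the wheel-odometry estimate, `vis` the visual-odometry one.
    The weights are illustrative assumptions; in practice they could be
    derived from the uncertainty of each sensor.
    """
    x = w_odo * odo[0] + w_vis * vis[0]
    y = w_odo * odo[1] + w_vis * vis[1]
    # Headings are averaged on the unit circle to handle angle wrap-around.
    sin_t = w_odo * math.sin(odo[2]) + w_vis * math.sin(vis[2])
    cos_t = w_odo * math.cos(odo[2]) + w_vis * math.cos(vis[2])
    return (x, y, math.atan2(sin_t, cos_t))

# Example: trusting the visual estimate more pulls the fused pose toward it.
fused = fuse_pose((0.0, 0.0, 0.0), (1.0, 1.0, 0.0))
```

With the default weights, `fused` lands at (0.8, 0.8, 0.0), i.e. closer to the visual-odometry estimate.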