Mobile Robots Navigation in Indoor Environments Using Kinect Sensor
CBSEC '12 Proceedings of the 2012 Second Brazilian Conference on Critical Embedded Systems
Localisation and mapping are key requirements for mobile robot navigation. Laser scanners are frequently used for this purpose, but they are expensive and provide only 2D mapping capabilities. In this paper we investigate the suitability of the Xbox Kinect optical sensor for navigation and simultaneous localisation and mapping (SLAM). We present a prototype that uses the Kinect to capture 3D point cloud data of the external environment. These data are used in a 3D SLAM algorithm to create 3D models of the environment and to localise the robot within it. By projecting the 3D point cloud onto a 2D plane, we also use the Kinect sensor data in a 2D SLAM algorithm. We compare the performance of the Kinect-based 2D and 3D SLAM algorithms with traditional solutions and show that the use of the Kinect sensor is viable. However, its smaller field of view, limited depth range, and the higher processing requirements for the resulting sensor data restrict its range of applications in practice.
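The projection step described in the abstract — flattening the Kinect's 3D point cloud onto a 2D plane so it can feed a conventional 2D SLAM algorithm — can be sketched as follows. This is a hedged illustration, not the paper's actual implementation: the coordinate frame (x forward, y left, z up), the height band, the 57° horizontal field of view (the Kinect's approximate spec), and the function name `point_cloud_to_2d_scan` are all assumptions made for the example. The idea is to keep only points in a horizontal slice at the robot's height, then bin them by bearing and keep the nearest return per bin, yielding a laser-scan-like range array.

```python
import numpy as np

def point_cloud_to_2d_scan(points, z_min=0.1, z_max=0.5,
                           fov_deg=57.0, num_bins=180):
    """Project 3D points (N x 3, metres; x forward, y left, z up)
    onto the ground plane and bin them into a laser-scan-like array.

    Returns an array of length num_bins holding, per bearing bin, the
    planar range to the nearest point; np.inf where no point landed.
    Frame, height band, and FOV are illustrative assumptions.
    """
    pts = np.asarray(points, dtype=float)
    # Keep only points in a horizontal slice at roughly the robot's height.
    band = pts[(pts[:, 2] >= z_min) & (pts[:, 2] <= z_max)]
    ranges = np.full(num_bins, np.inf)
    if band.size == 0:
        return ranges
    r = np.hypot(band[:, 0], band[:, 1])                 # planar range
    bearing = np.degrees(np.arctan2(band[:, 1], band[:, 0]))
    half_fov = fov_deg / 2.0
    in_fov = np.abs(bearing) <= half_fov                 # discard out-of-view points
    bins = ((bearing[in_fov] + half_fov) / fov_deg * num_bins).astype(int)
    bins = np.clip(bins, 0, num_bins - 1)
    # Keep the nearest return per bearing bin (the visible obstacle boundary).
    np.minimum.at(ranges, bins, r[in_fov])
    return ranges
```

In practice a similar conversion is what lets off-the-shelf 2D SLAM packages consume depth-camera data; the trade-off the paper notes (narrow field of view, limited depth range) shows up directly in how few bearing bins such a scan can populate compared with a real laser scanner.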