This article investigates the use of time-of-flight (ToF) cameras in mapping tasks for autonomous mobile robots, in particular in simultaneous localization and mapping (SLAM). Although ToF cameras are in principle attractive sensors for three-dimensional (3D) mapping owing to their high frame rate of 3D data, two characteristics make them difficult to use for mapping: their restricted field of view and the dependence of range-measurement quality on large variations in object reflectivity; in addition, currently available models suffer from poor data quality in several respects. The paper first summarizes calibration and filtering approaches that improve the accuracy, precision, and robustness of ToF cameras independently of the intended application. Several ego-motion estimation approaches are then applied or adapted in order to provide a performance benchmark for registering ToF camera data. As part of this, an extension to the iterative closest point (ICP) algorithm has been developed that increases robustness under a restricted field of view and under larger displacements. The paper compares these approaches in SLAM experiments in an indoor environment. The results show that applying ToF cameras to SLAM tasks is feasible, despite the complex error characteristics of this type of sensor. © 2009 Wiley Periodicals, Inc.
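For readers unfamiliar with the registration step mentioned above, the sketch below shows a minimal point-to-point ICP iteration in Python with NumPy: nearest-neighbour matching followed by a closed-form (SVD/Kabsch) rigid-alignment step. This is only the textbook baseline, not the paper's extended variant for restricted fields of view; the function names and parameters are illustrative choices, not from the original work.

```python
# Minimal point-to-point ICP sketch (illustrative baseline, not the paper's extension).
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst (Kabsch/SVD)."""
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    # Reflection correction keeps R a proper rotation (det(R) = +1).
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0] * (src.shape[1] - 1) + [d])
    R = Vt.T @ D @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

def icp(src, dst, iters=20):
    """Alternate nearest-neighbour matching and rigid re-alignment of src to dst."""
    cur = src.copy()
    R_total = np.eye(src.shape[1])
    t_total = np.zeros(src.shape[1])
    for _ in range(iters):
        # Brute-force nearest neighbours; a k-d tree would be used in practice.
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(axis=-1)
        matched = dst[d2.argmin(axis=1)]
        R, t = best_rigid_transform(cur, matched)
        cur = cur @ R.T + t
        # Accumulate the incremental transform into the overall estimate.
        R_total = R @ R_total
        t_total = R @ t_total + t
    return R_total, t_total
```

Like any plain ICP, this sketch assumes a good initial guess and substantial overlap between the point clouds; the robustness issues the article addresses (narrow field of view, larger displacements) are precisely where this baseline breaks down.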