Towards High-resolution Imaging from Underwater Vehicles

  • Authors:
  • Hanumant Singh; Chris Roman; Oscar Pizarro; Ryan Eustice; Ali Can

  • Affiliations:
  • Dept. of Applied Ocean Physics and Engineering, Woods Hole Oceanographic Institution, Woods Hole, MA 02543; Dept. of Applied Ocean Physics and Engineering, Woods Hole Oceanographic Institution, Woods Hole, MA 02543; Australian Centre for Field Robotics, University of Sydney, Sydney 2006, Australia; Department of Naval Architecture & Marine Engineering, University of Michigan, Ann Arbor, MI 48109; GE Global Research, Niskayuna, NY 12309

  • Venue:
  • International Journal of Robotics Research
  • Year:
  • 2007

Abstract

Large-area mapping at high resolution underwater continues to be limited by sensor-level environmental constraints and by the mismatch between available navigation accuracy and sensor accuracy. This paper presents advances that exploit aspects of the sensing modality, together with the consistency and redundancy within local sensor measurements, to build high-resolution optical and acoustic maps that are a consistent representation of the environment. The work is presented in the context of real-world data acquired with autonomous underwater vehicles (AUVs) and remotely operated vehicles (ROVs) in diverse applications, including shallow-water coral reef surveys with the Seabed AUV, a forensic survey of the RMS Titanic in the North Atlantic at a depth of 4100 m using the Hercules ROV, and a survey of the TAG hydrothermal vent area in the mid-Atlantic at a depth of 3600 m using the Jason II ROV. The focus is on the related problems of structure from motion from underwater optical imagery, assuming pose-instrumented, calibrated cameras. General wide-baseline solutions to these problems are presented, based on extending techniques from the simultaneous localization and mapping (SLAM), photogrammetry, and computer vision communities. The paper also examines how such techniques can be extended to the very different sensing modality and scale of multi-beam bathymetric mapping. For both the optical and acoustic mapping cases, it is shown how consistency in the maps can be used not only for better global mapping, but also to refine navigation estimates.
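
The abstract names, but does not spell out, the role of pose-instrumented, calibrated cameras in wide-baseline image registration. The sketch below is a minimal illustration of that idea: vehicle navigation predicts the epipolar geometry between two images, which can then gate candidate feature matches before any structure-from-motion or SLAM processing. It is not the authors' implementation; all function names, matrices, and numbers are illustrative assumptions.

```python
# Minimal sketch (illustrative, not from the paper): use navigation-derived
# camera poses and a known calibration to predict epipolar geometry between
# two wide-baseline images, then score candidate matches against it.
import numpy as np

def skew(t):
    """Skew-symmetric matrix so that skew(t) @ v == np.cross(t, v)."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

def predicted_fundamental(K, R_1w, t_1w, R_2w, t_2w):
    """Fundamental matrix predicted from navigation poses.

    R_iw, t_iw map world points into camera i: x_i = R_iw @ X + t_iw.
    K is the shared 3x3 intrinsic calibration matrix (assumed known).
    """
    # Relative pose of camera 2 with respect to camera 1.
    R_21 = R_2w @ R_1w.T
    t_21 = t_2w - R_21 @ t_1w
    E = skew(t_21) @ R_21                 # essential matrix
    Kinv = np.linalg.inv(K)
    return Kinv.T @ E @ Kinv              # fundamental matrix in pixel coords

def epipolar_distance(F, x1, x2):
    """Distance (pixels) of pixel x2 from the epipolar line F @ [x1; 1]."""
    l = F @ np.append(x1, 1.0)
    return abs(l @ np.append(x2, 1.0)) / np.hypot(l[0], l[1])

if __name__ == "__main__":
    # Illustrative calibration and a ~1 m lateral baseline between frames.
    K = np.array([[800.0, 0.0, 640.0],
                  [0.0, 800.0, 512.0],
                  [0.0, 0.0, 1.0]])
    R1, t1 = np.eye(3), np.zeros(3)
    R2, t2 = np.eye(3), np.array([-1.0, 0.0, 0.0])
    F = predicted_fundamental(K, R1, t1, R2, t2)
    # A candidate correspondence ~2 px off the predicted epipolar line.
    print(epipolar_distance(F, np.array([700.0, 500.0]),
                               np.array([900.0, 502.0])))
```

In practice the gating threshold would be widened to reflect navigation uncertainty, and the surviving correspondences would feed the wide-baseline structure-from-motion and SLAM machinery that the paper describes, which in turn can feed back refined navigation estimates.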