Research Infrastructure for Interactive Human- and Autonomous Guidance

  • Authors:
  • Bérénice Mettler, Navid Dadkhah, Zhaodan Kong, Jonathan Andersh

  • Affiliations:
  • Interactive Guidance and Control Lab (IGCL), Department of Aerospace Engineering and Mechanics, University of Minnesota, Minneapolis, MN 55455, USA (Mettler, Dadkhah, Kong); Department of Computer Science and Engineering, University of Minnesota, Minneapolis, MN 55455, USA (Andersh)

  • Venue:
  • Journal of Intelligent and Robotic Systems
  • Year:
  • 2013


Abstract

This paper describes a research infrastructure set up to exercise and investigate guidance and control capabilities under human and autonomous control modalities. The lab facility is designed to implement tasks that emphasize agent-environment interactions. The overall goal is to characterize these interactions and to apply the gained knowledge to determine interaction models. These models can then be used to design guidance and control algorithms as well as human-machine systems. The facility uses miniature rotorcraft as test vehicles, together with a Vicon motion tracking system and a SensoMotoric gaze tracking system. The facility also includes a high-fidelity simulation system to support larger-scale autonomy and teleoperation experiments. The simulation incorporates the software components and models of the key flight hardware and sensors. The software system was integrated around the Robot Operating System (ROS) to support the heterogeneous processes and data and to allow easy system reconfiguration. The paper describes the research objectives, details the hardware and software components and their integration, and concludes with a summary of the ongoing research enabled by the lab facility.
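
To illustrate the kind of ROS-based integration the abstract refers to, the following is a minimal sketch of a node that relays motion-capture poses to downstream guidance and control processes. The topic names and vehicle identifier are illustrative assumptions (based on the common vicon_bridge convention), not the lab's actual configuration.

```python
#!/usr/bin/env python
# Minimal sketch: relay Vicon motion-capture transforms as vehicle poses
# for downstream guidance/control nodes. Topic names are assumptions.
import rospy
from geometry_msgs.msg import TransformStamped, PoseStamped

POSE_TOPIC = "/vicon/helicopter/helicopter"  # hypothetical vicon_bridge topic


def on_vicon(msg):
    # Repackage the TransformStamped from the tracker as a PoseStamped.
    pose = PoseStamped()
    pose.header = msg.header
    pose.pose.position.x = msg.transform.translation.x
    pose.pose.position.y = msg.transform.translation.y
    pose.pose.position.z = msg.transform.translation.z
    pose.pose.orientation = msg.transform.rotation
    pose_pub.publish(pose)


if __name__ == "__main__":
    rospy.init_node("vicon_pose_relay")
    pose_pub = rospy.Publisher("vehicle/pose", PoseStamped, queue_size=10)
    rospy.Subscriber(POSE_TOPIC, TransformStamped, on_vicon)
    rospy.spin()
```

Because each sensor and control process is wrapped as a separate ROS node publishing on named topics, components such as the gaze tracker, the simulation, or a human teleoperation interface can be swapped in or out by remapping topics rather than recompiling the system.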