Fusing multiple sensors information into mixed reality-based user interface for robot teleoperation

  • Authors:
  • Jie Zhu; Xiangyu Wang; Michael Rosenman

  • Affiliations:
  • Design Lab, Faculty of Architecture, Design and Planning, The University of Sydney, Sydney, NSW, Australia (all authors)

  • Venue:
  • SMC'09 Proceedings of the 2009 IEEE international conference on Systems, Man and Cybernetics
  • Year:
  • 2009

Abstract

Mixed Reality commonly refers to the merging of real and virtual worlds to produce new visualization environments in which physical and digital objects co-exist and interact in real time. Mixed Reality can also be used to fuse sensor data into an existing user interface in order to improve situation awareness, facilitate understanding of the surrounding environment, and support prediction of future states. The work presented in this paper fuses real video and complementary sensor information into a single Mixed Reality interface. A simulation platform for testing the Mixed Reality interface for teleoperation is also discussed.
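
To illustrate the kind of fusion the abstract describes, the sketch below overlays sensor readings on a live video frame so that both appear in one view. This is a minimal Python/OpenCV sketch under stated assumptions, not the authors' implementation: the camera index, the `overlay_sensor_data` helper, and the range values are all hypothetical.

```python
import cv2

def overlay_sensor_data(frame, ranges):
    """Draw hypothetical range-sensor readings on top of a video frame,
    producing a single fused view (a minimal stand-in for the paper's
    Mixed Reality interface)."""
    h, w = frame.shape[:2]
    overlay = frame.copy()
    # Semi-transparent status bar along the bottom of the frame.
    cv2.rectangle(overlay, (0, h - 40), (w, h), (0, 0, 0), -1)
    frame = cv2.addWeighted(overlay, 0.5, frame, 0.5, 0)
    # Render each (label, distance) pair as text on the bar.
    x = 10
    for label, dist in ranges:
        cv2.putText(frame, f"{label}: {dist:.2f} m", (x, h - 12),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 255, 0), 1)
        x += 140
    return frame

cap = cv2.VideoCapture(0)  # assumed camera index
# Hypothetical readings; a real system would poll the robot's sensors.
fake_ranges = [("front", 1.32), ("left", 0.87), ("right", 2.10)]
while True:
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("fused view", overlay_sensor_data(frame, fake_ranges))
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```

The design point is that the operator never has to look away from the video: complementary data is composited into the same view, which is the basic premise of the paper's Mixed Reality interface.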