Towards multimodal human-robot interaction in large scale virtual environment

  • Authors:
  • Pierre Boudoin, Christophe Domingues, Samir Otmane, Nassima Ouramdane, Malik Mallem

  • Affiliations:
  • IBISC Laboratory - University of Evry, Evry, France (all authors)

  • Venue:
  • Proceedings of the 3rd ACM/IEEE international conference on Human robot interaction
  • Year:
  • 2008


Abstract

Human Operators (HO) of telerobotics systems may be asked to achieve complex operations with robots. Designing usable and effective Human-Robot Interaction (HRI) is very challenging for system developers and human factors specialists. The search for new metaphors and techniques for HRI adapted to telerobotics systems has led to the concept of Multimodal HRI (MHRI). MHRI allows users to interact naturally and easily with robots by combining multiple devices through an efficient Multimodal Management System (MMS). Such a system should bring a new user experience in terms of natural interaction, usability, efficiency and flexibility to the HRI system. Good management of multimodality is therefore essential. Moreover, the MMS must be transparent to the user in order to be efficient and natural. Empirical evaluation is necessary to assess the quality of our MMS. We use an Empirical Evaluation Assistant (EEA) designed at the IBISC laboratory. The EEA makes it possible to rapidly gather significant feedback about the usability of interaction during the development lifecycle, whereas HRI is classically evaluated by ergonomics experts only at the end of the development lifecycle. Results from a preliminary evaluation of robot teleoperation tasks, using the ARITI software framework to assist the user in piloting the robot and the IBISC semi-immersive VR/AR platform EVR@, are given. They compare the use of a Flystick and Data Gloves for 3D interaction with the robot. They show that our MMS is functional, although the multimodality used in our experiments is not sufficient to provide efficient Human-Robot Interaction. The EVR@ SPIDAR force-feedback device will be integrated into our MMS to improve the user's efficiency.