Evaluating User Interface of Multimodal Teaching Advisor Implemented on a Wearable Personal Computer

  • Authors:
  • Yoshimasa Yanagihara; Sinyo Muto; Takao Kakizaki

  • Affiliations:
  • NTT Cyber Solutions Laboratories, 3-9-11 Midori-cho, Musashino-shi, Tokyo 180-8585, Japan (all authors); e-mail: yy@nttarm.hil.ntt.co.jp

  • Venue:
  • Journal of Intelligent and Robotic Systems
  • Year:
  • 2001


Abstract

A multimodal teaching advisor (MTA), which combines the work-site operator's know-how with robotic-system information, including sensor data, in a complementary manner, has been enhanced and implemented on a wearable personal computer (WPC) for use with sensor-enhanced robotic systems in manufacturing. The MTA software was extended to acquire and monitor sensory and robot-motion data, and the MTA presents support information to the operator through graphical and speech user interfaces on the WPC. Experimental results for a spatial-path-tracking task performed with a laser range finder showed that the MTA greatly improved operability. In particular, presenting support information simultaneously through both the graphical and speech interfaces shortened the time the operator needed to teach a robot with a teaching pendant.
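The simultaneous graphical and speech presentation described above can be pictured as a simple fan-out of one support message to multiple interface channels. The following is a minimal illustrative sketch only; all class and method names are hypothetical, since the paper does not describe the MTA's actual software interfaces.

```python
class GraphicalChannel:
    """Illustrative stand-in for the WPC's graphical user interface."""

    def present(self, message: str) -> str:
        # In the real system this would render on the WPC display.
        return f"[GUI] {message}"


class SpeechChannel:
    """Illustrative stand-in for the WPC's speech user interface."""

    def present(self, message: str) -> str:
        # In the real system this would drive speech output.
        return f"[Speech] {message}"


class MultimodalAdvisor:
    """Fans one support message out to every registered channel,
    mirroring the simultaneous GUI + speech presentation idea."""

    def __init__(self, channels):
        self.channels = channels

    def advise(self, message: str) -> list[str]:
        return [ch.present(message) for ch in self.channels]


advisor = MultimodalAdvisor([GraphicalChannel(), SpeechChannel()])
print(advisor.advise("Path deviation detected"))
# → ['[GUI] Path deviation detected', '[Speech] Path deviation detected']
```

The design point is that adding or removing an output modality only changes the channel list, not the advisor logic.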