Multimodal motion guidance: techniques for adaptive and dynamic feedback

  • Authors:
  • Christian Schönauer;Kenichiro Fukushi;Alex Olwal;Hannes Kaufmann;Ramesh Raskar

  • Affiliations:
Vienna University of Technology, Vienna, Austria & Massachusetts Institute of Technology, Cambridge, MA, USA;Tokyo Institute of Technology, Tokyo, Japan & Massachusetts Institute of Technology, Cambridge, MA, USA;Massachusetts Institute of Technology, Cambridge, MA, USA;Vienna University of Technology, Vienna, Austria;Massachusetts Institute of Technology, Cambridge, MA, USA

  • Venue:
  • Proceedings of the 14th ACM international conference on Multimodal interaction
  • Year:
  • 2012

Abstract

The ability to guide human motion through automatically generated feedback has significant potential for applications in areas such as motor learning, human-computer interaction, telepresence, and augmented reality. This paper focuses on the design and development of such systems from a human cognition and perception perspective. We analyze the dimensions of the design space for motion guidance systems, spanned by technologies and human information processing, and identify opportunities for new feedback techniques. We present a novel motion guidance system that was implemented based on these insights to enable feedback on position, direction, and continuous velocity. It uses motion capture to track the user in space and provides guidance through visual, vibrotactile, and pneumatic actuation. Our system also introduces motion retargeting through time warping, motion dynamics, and prediction, allowing greater flexibility and adaptability to the user's performance.
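The abstract mentions motion retargeting through time warping to adapt guidance to the user's pace. As a rough illustration of the general idea (not the authors' implementation), the sketch below uses dynamic time warping to align a performed motion-capture trajectory to a reference motion; the function name, array shapes, and example data are assumptions made for this sketch only.

```python
# Illustrative sketch: aligning a user's performed trajectory to a reference
# motion with dynamic time warping (DTW). This is NOT the paper's system;
# names and shapes are assumptions for the example.
import numpy as np

def dtw_align(reference, performed):
    """Return the DTW cost matrix and warping path between two trajectories.

    reference, performed: arrays of shape (T, D) -- T poses of D dimensions
    (e.g., joint positions from motion capture).
    """
    n, m = len(reference), len(performed)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(reference[i - 1] - performed[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # advance reference only
                                 cost[i, j - 1],      # advance performed only
                                 cost[i - 1, j - 1])  # match both frames
    # Backtrack to recover the warping path (pairs of aligned frame indices).
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([cost[i - 1, j - 1], cost[i - 1, j], cost[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return cost[1:, 1:], path[::-1]

if __name__ == "__main__":
    t = np.linspace(0, 2 * np.pi, 50)
    reference = np.column_stack([np.sin(t), np.cos(t)])             # target motion
    performed = np.column_stack([np.sin(0.8 * t), np.cos(0.8 * t)]) # slower user
    _, path = dtw_align(reference, performed)
    print("aligned frame pairs:", path[:5], "...")
```

The warping path maps each frame of the user's performance to the corresponding frame of the reference, which is the kind of temporal alignment a guidance system could use before computing positional or velocity feedback.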