Multi-modal interfaces for control of assistive robotic devices

  • Authors: Christopher Dale McMurrough
  • Affiliations: The University of Texas at Arlington, Arlington, TX, USA
  • Venue: Proceedings of the 14th ACM international conference on Multimodal interaction
  • Year: 2012


Abstract

This paper outlines dissertation research aimed at advancing the use of non-traditional, multimodal interfaces in assistive robotic devices. The data modalities of particular interest are perception of the environment using 3D scanning and computer vision, estimation of the user's point of gaze, and perception of user intent during interaction with objects of interest. The main goal of this research is to explore the hypothesis that combining these data modalities can provide intuitive and effective control of existing robotic platforms, such as wheelchairs and manipulators, for users with severe physical impairments.
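One way the abstract's modalities could combine is to cast the estimated gaze as a 3D ray and intersect it with the scanned environment to pick a candidate target object. The sketch below is a minimal, hypothetical illustration of that fusion step (the function name, threshold, and point-labeling scheme are assumptions, not the paper's actual algorithm):

```python
import numpy as np

def select_gaze_target(eye, gaze_dir, points, labels, max_dist=0.05):
    """Return the label of the 3D point closest to the gaze ray.

    eye: (3,) ray origin (estimated eye position).
    gaze_dir: (3,) gaze direction (need not be unit length).
    points: (N, 3) point cloud from the 3D scan.
    labels: length-N object labels for each point.
    max_dist: reject selections farther than this from the ray (meters).

    Hypothetical fusion step for illustration only.
    """
    d = np.asarray(gaze_dir, dtype=float)
    d = d / np.linalg.norm(d)
    v = np.asarray(points, dtype=float) - np.asarray(eye, dtype=float)
    t = np.clip(v @ d, 0.0, None)   # project points onto ray, in front of eye only
    perp = v - t[:, None] * d       # perpendicular offset of each point from the ray
    dist = np.linalg.norm(perp, axis=1)
    i = int(np.argmin(dist))
    return labels[i] if dist[i] <= max_dist else None
```

A confirmed selection (the "user intent" modality, e.g. a dwell or switch input) would then be forwarded as a goal to the wheelchair or manipulator controller.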