Integrating human and computer vision with EEG toward the control of a prosthetic arm

  • Authors:
  • Eugene Lavely; Geoffrey Meltzner; Rick Thompson

  • Affiliations:
  • BAE Systems, Burlington, MA, USA (all authors)

  • Venue:
  • HRI '12: Proceedings of the Seventh Annual ACM/IEEE International Conference on Human-Robot Interaction
  • Year:
  • 2012

Abstract

We are undertaking the development of a brain-computer interface (BCI) [1] for the control of an upper-limb prosthetic. Our approach exploits electrical neural activity data for motor intent estimation and eye gaze direction for target selection. These data streams are augmented by computer vision (CV) for 3D scene reconstruction, and are integrated with a hierarchical controller to achieve semi-autonomous control. User interfaces for the effective control of the many degrees of freedom (DOF) of advanced prosthetic arms are not yet available [2]. Ideally, the combined arm and interface technology provides the user with reliable and dexterous capability for reaching, grasping, and fine-scale manipulation. Technologies that improve arm embodiment, i.e., the impression by the amputee that the arm is a natural part of their body concept, present an important and difficult challenge to the human-robot interaction research community. Such embodiment is clearly predicated on cross-disciplinary advances, including accurate intent estimation and an algorithmic basis for natural arm control.
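To make the three-stream architecture described above concrete, the Python sketch below wires together a gaze-based target selector over a reconstructed 3D point set, a toy EEG intent decoder, and a two-level controller in which a high-level layer gates an autonomous low-level reach on decoded intent. This is a minimal illustration under assumed interfaces: all function names, the linear sigmoid decoder, the angular gaze-to-object matching, and the proportional servo loop are hypothetical stand-ins, not the authors' algorithms, which the abstract does not specify.

```python
import numpy as np

def select_target(gaze_origin, gaze_dir, scene_points, max_angle_deg=5.0):
    """Pick the reconstructed 3D point closest (angularly) to the gaze ray."""
    offsets = scene_points - gaze_origin
    offsets_unit = offsets / np.linalg.norm(offsets, axis=1, keepdims=True)
    cos_angles = offsets_unit @ (gaze_dir / np.linalg.norm(gaze_dir))
    best = int(np.argmax(cos_angles))
    angle = np.degrees(np.arccos(np.clip(cos_angles[best], -1.0, 1.0)))
    if angle > max_angle_deg:
        return None  # gaze is not near any candidate object
    return scene_points[best]

def decode_intent(eeg_features, weights):
    """Toy linear decoder: EEG feature vector -> scalar 'go' intent in [0, 1]."""
    return float(1.0 / (1.0 + np.exp(-(eeg_features @ weights))))

def low_level_step(effector_pos, target_pos, gain=0.2):
    """Proportional servo step toward the target (autonomous reach phase)."""
    return effector_pos + gain * (target_pos - effector_pos)

# --- simulated control loop (all data synthetic) ---
rng = np.random.default_rng(0)
scene = rng.uniform(-0.5, 0.5, size=(20, 3)) + np.array([0.6, 0.0, 0.3])
gaze_origin = np.zeros(3)
gaze_dir = scene[7] - gaze_origin              # user fixates object 7
target = select_target(gaze_origin, gaze_dir, scene)

effector = np.array([0.0, -0.3, 0.0])
weights = rng.normal(size=8)
for _ in range(50):
    intent = decode_intent(rng.normal(size=8), weights)  # stand-in EEG features
    if target is not None and intent > 0.5:              # high-level gate
        effector = low_level_step(effector, target)       # low-level servo

print("final distance to target:", np.linalg.norm(effector - target))
```

The design choice illustrated here mirrors the "semi-autonomous" framing of the abstract: the user supplies only the target (via gaze) and the go/no-go decision (via decoded intent), while the low-level controller autonomously handles the many-DOF reach trajectory.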