Navigating a smart wheelchair with a brain-computer interface interpreting steady-state visual evoked potentials

  • Authors:
  • Christian Mandel, Thorsten Lüth, Tim Laue, Thomas Röfer, Axel Gräser, Bernd Krieg-Brückner

  • Affiliations:
  • Department of Mathematics and Computer Science, FB3, University of Bremen, Bremen, Germany (Christian Mandel)
  • Institute of Automation, University of Bremen, Bremen, Germany (Thorsten Lüth, Axel Gräser)
  • German Research Center for Artificial Intelligence, Research Group Safe and Secure Cognitive Systems, Bremen, Germany (Tim Laue, Thomas Röfer, Bernd Krieg-Brückner)

  • Venue:
  • IROS '09: Proceedings of the 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems
  • Year:
  • 2009

Abstract

To allow severely disabled people who cannot move their arms and legs to steer an automated wheelchair, this work proposes the combination of a non-invasive EEG-based human-robot interface and an autonomous navigation system that safely executes the issued commands. The robust classification of steady-state visual evoked potentials in brain activity allows for the seamless projection of qualitative directional navigation commands onto a frequently updated route-graph representation of the environment. The deduced metrical target locations are then reached by an extended version of the well-established Nearness Diagram Navigation method. The applicability of the proposed system is demonstrated in a real-world pilot study in which eight out of nine untrained subjects successfully navigated an automated wheelchair after only about ten minutes of preparation.
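The abstract does not specify how the SSVEP responses are classified. A common approach to SSVEP detection, sketched below for illustration only, is to compare the EEG's spectral power at each flicker frequency against the surrounding background and issue the command whose frequency stands out most. The stimulus frequencies, sampling rate, and SNR threshold here are illustrative assumptions, not values taken from the paper.

```python
import numpy as np
from scipy.signal import welch

# Hypothetical flicker frequencies (Hz) mapped to directional commands;
# the actual stimulus frequencies used in the paper are not given in the abstract.
STIMULUS_FREQS = {13.0: "left", 14.0: "right", 15.0: "forward", 16.0: "backward"}

FS = 256  # assumed EEG sampling rate in Hz


def classify_ssvep(eeg_window, fs=FS, min_snr=2.0):
    """Classify a single-channel EEG window by comparing spectral power
    at each stimulus frequency against the local background power.

    Returns the command whose frequency shows the strongest power ratio,
    or None if no frequency stands out (i.e., no command is issued).
    """
    freqs, psd = welch(eeg_window, fs=fs, nperseg=fs * 2)

    best_cmd, best_snr = None, min_snr
    for f0, cmd in STIMULUS_FREQS.items():
        # Power in a narrow band around the stimulus frequency...
        target = psd[(freqs > f0 - 0.25) & (freqs < f0 + 0.25)].mean()
        # ...relative to the mean power of the surrounding band.
        noise = psd[(freqs > f0 - 2.0) & (freqs < f0 + 2.0)].mean()
        snr = target / noise
        if snr > best_snr:
            best_cmd, best_snr = cmd, snr
    return best_cmd


# Example: a synthetic 4 s window containing a 15 Hz response plus noise.
t = np.arange(0, 4, 1 / FS)
window = 2.0 * np.sin(2 * np.pi * 15.0 * t) + np.random.randn(t.size)
print(classify_ssvep(window))  # expected: "forward"
```

The thresholded power ratio means the classifier can abstain when the subject attends to no stimulus, which matters for a wheelchair interface: an uncertain window should produce no navigation command rather than a wrong one.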