A passive brain-computer interface for supporting gaze-based human-machine interaction

  • Authors:
  • Janna Protzak; Klas Ihme; Thorsten Oliver Zander

  • Affiliations:
  • Janna Protzak: Team PhyPA, Berlin Institute of Technology, Berlin, Germany; Research Training Group Prometei, Berlin Institute of Technology, Berlin, Germany
  • Klas Ihme: Team PhyPA, Berlin Institute of Technology, Berlin, Germany; Department for Psychosomatic Medicine and Psychotherapy, University of Leipzig, Leipzig, Germany
  • Thorsten Oliver Zander: Team PhyPA, Berlin Institute of Technology, Berlin, Germany; Biological Psychology and Neuroergonomics, Berlin Institute of Technology, Berlin, Germany

  • Venue:
  • UAHCI'13: Proceedings of the 7th International Conference on Universal Access in Human-Computer Interaction: Design Methods, Tools, and Interaction Techniques for eInclusion - Volume Part I
  • Year:
  • 2013


Abstract

Tracking eye movements to control technical systems is becoming increasingly popular; using eye movements to direct a cursor in human-computer interaction (HCI) is particularly convenient and caters to healthy and disabled users alike. However, it is often difficult to find an appropriate substitute for the click operation, especially in hands-free interaction. The most common approach is the use of dwell times, but this can lead to the so-called "Midas Touch" problem: the system misinterprets fixations caused by long processing times or spontaneous dwelling as user commands. The current study explores event-related potentials (ERPs) that might indicate a user's intention to select. To this end, electroencephalography (EEG) data were recorded from 10 participants while they interacted with a dwell-time-based selection system. The aim was to identify EEG potentials related to the intention to interact (i.e., the selection of targets on a screen) and to classify them against EEG potentials recorded during random fixations on the screen that were unrelated to interaction. We found a clear negativity over parietal electrodes when participants intended to select an item; this negativity did not occur when participants fixated an object without the intention to select it. The underlying brain activity could be classified robustly in most participants, with an average accuracy of 81%. The present study provides evidence that the intention to interact evokes EEG activity that can reliably be detected by passive BCI technology. This enables a new type of implicit interaction with the potential to improve human-machine interaction by increasing efficiency and making it more intuitive.
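The epoch-classification idea described in the abstract can be illustrated with a minimal template-matching sketch on simulated single-channel data. Everything here is an assumption for illustration only: the synthetic parietal negativity, the epoch counts, the sampling parameters, and the nearest-template classifier are not the paper's actual pipeline (the authors' classifier and accuracy figures come from their own EEG recordings).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters (not from the paper): 64 epochs per class,
# 100 samples per epoch, roughly 400 ms of single-channel EEG.
n_epochs, n_samples = 64, 100
t = np.linspace(0.0, 0.4, n_samples)

# "Intent" epochs carry a simulated parietal negativity peaking ~250 ms;
# "no-intent" epochs are noise around baseline.
negativity = -3.0 * np.exp(-((t - 0.25) ** 2) / 0.002)
intent = negativity + rng.normal(0.0, 1.0, (n_epochs, n_samples))
no_intent = rng.normal(0.0, 1.0, (n_epochs, n_samples))

X = np.vstack([intent, no_intent])
y = np.array([1] * n_epochs + [0] * n_epochs)

# Split each class into train/test halves.
train_idx = np.r_[0:32, 64:96]
test_idx = np.r_[32:64, 96:128]

# Learn a class-mean ERP template from the training epochs, then
# label each test epoch by its nearest template (Euclidean distance).
templates = {c: X[train_idx][y[train_idx] == c].mean(axis=0) for c in (0, 1)}

def classify(epoch):
    return min((np.linalg.norm(epoch - tmpl), c)
               for c, tmpl in templates.items())[1]

pred = np.array([classify(e) for e in X[test_idx]])
accuracy = (pred == y[test_idx]).mean()
print(f"test accuracy: {accuracy:.2f}")
```

With a signal this strong the toy classifier separates the two conditions easily; real single-trial EEG is far noisier, which is why ERP-based BCIs typically rely on regularized linear classifiers trained per participant rather than raw template matching.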