The use of eye movements in human-computer interaction techniques: what you look at is what you get
ACM Transactions on Information Systems (TOIS) - Special issue on computer-human interaction
New technological windows into mind: there is more in eyes and brains for human-computer interaction
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
BC(eye): Combining Eye-Gaze Input with Brain-Computer Interaction
UAHCI '09 Proceedings of the 5th International Conference on Universal Access in Human-Computer Interaction. Part II: Intelligent and Ubiquitous Interaction Environments
ACII'11 Proceedings of the 4th international conference on Affective computing and intelligent interaction - Volume Part II
Tracking eye movements to control technical systems is becoming increasingly popular; the use of eye movements to direct a cursor in human-computer interaction (HCI) is particularly convenient and caters to healthy and disabled users alike. However, it is often difficult to find an appropriate substitute for the click operation, especially in hands-free interaction. The most common approach is the use of dwell times, but this can lead to the so-called "Midas-Touch" problem, which arises when the system misinterprets fixations, caused by long processing times or spontaneous dwellings, as a user command. The current study explores event-related potentials (ERPs) that might indicate a user's intention to select. To this end, electroencephalography (EEG) data were recorded from 10 participants while they interacted with a dwell-time selection system. The aim was to identify EEG potentials related to the intention to interact (i.e., the selection of targets on a screen) and to classify these against EEG potentials unrelated to interaction, recorded during random fixations on the screen. We found a clear negativity over parietal electrodes when participants intended to select an item. This negativity did not occur when participants fixated an object without the intention to select it. We could robustly classify the underlying brain activity in most participants, with an average accuracy of 81%. The presented study provides evidence that the intention to interact evokes EEG activity that can be reliably detected by passive BCI technology. This enables a new type of implicit interaction that holds the potential to make human-machine interaction more efficient and more intuitive.
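The interaction scheme described in the abstract, dwell-time selection gated by a passive-BCI intention signal, can be illustrated with a minimal sketch. This is not the authors' implementation; the function name, thresholds, and the idea of a scalar classifier confidence in [0, 1] are all assumptions made for illustration.

```python
# Illustrative sketch (hypothetical, not the study's code): a selection is
# triggered only if the gaze dwells long enough AND an EEG-based intention
# classifier reports sufficient evidence. Requiring both signals is one way
# to mitigate the "Midas-Touch" problem, where long but unintentional
# fixations would otherwise be interpreted as clicks.

def dwell_select(fixation_ms, eeg_confidence,
                 dwell_threshold_ms=500, confidence_threshold=0.8):
    """Return True if the current fixation should count as a selection.

    fixation_ms          -- duration of the current fixation (ms)
    eeg_confidence       -- hypothetical intention-classifier output in [0, 1]
    dwell_threshold_ms   -- assumed dwell-time threshold
    confidence_threshold -- assumed cut-off for the intention classifier
    """
    return fixation_ms >= dwell_threshold_ms and eeg_confidence >= confidence_threshold

# A long fixation without intention (low EEG evidence) is ignored,
# while the same dwell with high EEG evidence triggers the selection.
print(dwell_select(800, 0.30))  # spontaneous dwelling -> False
print(dwell_select(800, 0.95))  # intended selection   -> True
```

In a pure dwell-time system the first case would already fire a click; the EEG gate is what suppresses it.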