Brain-computer interfaces (BCIs) are interaction devices that enable users to send commands to a computer using brain activity alone. In this paper, we propose a new interaction technique that enables users to perform complex interaction tasks and to navigate within large virtual environments (VEs) using only a BCI based on imagined movements (motor imagery). This technique lets the user send high-level mental commands, leaving the application in charge of most of the complex and tedious details of the interaction task. More precisely, it is based on points of interest and enables subjects to send only a few commands to the application in order to navigate from one point of interest to another. Interestingly, the points of interest for a given VE can be generated automatically by processing the VE's geometry. As the navigation between two points of interest is also automatic, the proposed technique can be used to navigate efficiently by thought within any VE. The input of this interaction technique is a newly designed self-paced BCI that enables the user to send three different commands based on motor imagery. This BCI is built on a fuzzy inference system with reject options. To evaluate the efficiency of the proposed interaction technique, we compared it with the state-of-the-art method during a virtual museum exploration task. The state-of-the-art method uses low-level commands: each mental state of the user is associated with a simple command such as turning left or moving forward in the VE. In contrast, our method based on high-level commands enables the user to simply select a destination, leaving the application to perform the movements necessary to reach it.
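As a rough illustration of the reject-option idea behind a self-paced BCI (this is not the paper's fuzzy inference system; the class names, membership scores, and threshold below are illustrative assumptions), a classifier can map each analysis window to one of three motor-imagery commands, or to "no command" when confidence is too low:

```python
# Hedged sketch of a self-paced, 3-command classifier with a reject
# option. The paper's fuzzy inference system is replaced here by plain
# class-membership scores; the command names and the 0.6 threshold are
# assumptions made for the example, not the authors' design.

COMMANDS = ("left_hand", "right_hand", "feet")  # assumed MI classes
REJECT_THRESHOLD = 0.6  # assumed confidence cut-off

def classify_with_reject(memberships):
    """memberships: dict mapping each command to a score in [0, 1].

    Returns the winning command, or None when the best score falls
    below the reject threshold -- the 'self-paced' case where the
    user is not intentionally issuing any command."""
    best = max(COMMANDS, key=lambda c: memberships[c])
    if memberships[best] < REJECT_THRESHOLD:
        return None  # reject: no control state detected
    return best

print(classify_with_reject({"left_hand": 0.8, "right_hand": 0.1, "feet": 0.1}))
# -> left_hand
print(classify_with_reject({"left_hand": 0.4, "right_hand": 0.35, "feet": 0.25}))
# -> None (ambiguous activity is rejected rather than forced into a command)
```

The reject option is what makes the interface self-paced: the system stays idle between intentional commands instead of continuously mapping every EEG window to an action.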
Our results showed that with our interaction technique, users can navigate within a virtual museum almost twice as fast as with low-level commands, and with nearly half the commands, resulting in less stress and more comfort for the user. This suggests that our technique makes efficient use of the limited capacity of current motor imagery-based BCIs to perform complex interaction tasks in VEs, opening the way to promising new applications.
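The high-level command scheme described above can be sketched as a simple selection loop: two mental commands cycle through the points of interest and a third confirms the choice, after which the application navigates to the selected point automatically. The command names and the museum rooms below are illustrative assumptions, not the paper's actual interface:

```python
# Hedged sketch of high-level, point-of-interest (POI) navigation.
# Three BCI commands ("next", "previous", "select") are assumed for
# illustration; the POI list would in practice be generated
# automatically from the VE's geometry, as the paper describes.

def navigate(pois, commands):
    """pois: ordered list of POI names; commands: sequence of decoded
    BCI commands. Returns the POIs reached via automatic navigation."""
    cursor, visited = 0, []
    for cmd in commands:
        if cmd == "next":
            cursor = (cursor + 1) % len(pois)      # highlight next POI
        elif cmd == "previous":
            cursor = (cursor - 1) % len(pois)      # highlight previous POI
        elif cmd == "select":
            visited.append(pois[cursor])           # app walks there automatically
    return visited

rooms = ["entrance", "sculpture hall", "paintings", "exit"]
print(navigate(rooms, ["next", "select", "next", "next", "select"]))
# -> ['sculpture hall', 'exit']
```

Reaching either destination costs only a handful of discrete commands, whereas a low-level scheme would require a continuous stream of turn/forward commands for the same trajectory, which is the efficiency gap the evaluation quantifies.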