Our project focused on recognizing emotion from human brain activity as measured by EEG signals. We propose a system that analyzes EEG signals and classifies them into five classes on each of two emotional dimensions: valence and arousal. The system was designed using prior knowledge from earlier research and is intended to assess how well emotion recognition from EEG signals works in practice. To perform this assessment, we gathered a dataset of EEG signals recorded from people who were emotionally stimulated by pictures. This enabled us to teach the system the relationship between characteristics of brain activity and emotion. We found that the EEG signals contained enough information to separate five classes on both the valence and arousal dimensions. However, using 3-fold cross-validation for training and testing, we reached classification rates of only 32% for the valence dimension and 37% for the arousal dimension. Much better classification rates were achieved when using only the extreme values on both dimensions: 71% and 81%, respectively.
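As a rough illustration of the evaluation protocol described above, the sketch below runs 3-fold cross-validation over a labeled feature set and reports mean accuracy. This is not the authors' actual pipeline: the nearest-centroid classifier, the synthetic feature vectors, and all function names are assumptions standing in for whatever features and model the system used.

```python
# Hypothetical sketch of 3-fold cross-validation, not the paper's actual
# system. A nearest-centroid classifier stands in for the real model.
import random
from statistics import mean

def nearest_centroid_predict(train_X, train_y, x):
    # One centroid per class; predict the class whose centroid is closest.
    centroids = {}
    for label in set(train_y):
        rows = [xi for xi, yi in zip(train_X, train_y) if yi == label]
        centroids[label] = [mean(col) for col in zip(*rows)]
    def sq_dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(centroids, key=lambda lbl: sq_dist(centroids[lbl], x))

def cross_validate(X, y, k=3, seed=0):
    # Shuffle indices, split into k folds, train on k-1 folds and
    # test on the held-out fold; return the mean accuracy.
    idx = list(range(len(X)))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    accuracies = []
    for test_idx in folds:
        train_idx = [i for i in idx if i not in test_idx]
        train_X = [X[i] for i in train_idx]
        train_y = [y[i] for i in train_idx]
        correct = sum(
            nearest_centroid_predict(train_X, train_y, X[i]) == y[i]
            for i in test_idx
        )
        accuracies.append(correct / len(test_idx))
    return mean(accuracies)
```

On two well-separated synthetic clusters, `cross_validate(X, y, k=3)` returns an accuracy near 1.0; on real EEG features the same protocol would yield figures like the 32% and 37% reported above.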