EEG correlates of different emotional states elicited during watching music videos
ACII'11: Proceedings of the 4th International Conference on Affective Computing and Intelligent Interaction - Volume Part II
Recently, the field of automatic recognition of users' affective states has gained a great deal of attention. Automatic, implicit recognition of affective states has many applications, ranging from personalized content recommendation to automatic tutoring systems. In this work, we present promising results from our research on classifying emotions induced by watching music videos. We show robust correlations between users' self-assessments of arousal and valence and the frequency-band powers of their EEG activity. We present methods for single-trial classification using both EEG and peripheral physiological signals. For EEG, an average (maximum) classification rate of 55.7% (67.0%) was obtained for arousal and 58.8% (76.0%) for valence. For peripheral physiological signals, the corresponding results were 58.9% (85.5%) for arousal and 54.2% (78.5%) for valence.
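To illustrate the kind of pipeline the abstract describes, the sketch below extracts frequency-band power features from a single EEG trial and classifies it as high or low valence. This is a minimal, hypothetical illustration, not the authors' pipeline: the sampling rate, band boundaries, channel count, synthetic data, and the nearest-centroid classifier are all illustrative assumptions.

```python
import numpy as np
from scipy.signal import welch

FS = 128  # assumed EEG sampling rate (Hz); illustrative, not from the paper
# Illustrative frequency bands (Hz); actual band choices vary by study
BANDS = {"theta": (4, 8), "alpha": (8, 12), "beta": (12, 30)}

def band_power_features(trial, fs=FS):
    """One trial (n_channels x n_samples) -> flat feature vector of
    mean spectral power per band and channel, via Welch's method."""
    freqs, psd = welch(trial, fs=fs, nperseg=fs * 2, axis=-1)
    feats = [psd[:, (freqs >= lo) & (freqs < hi)].mean(axis=-1)
             for lo, hi in BANDS.values()]
    return np.concatenate(feats)

def nearest_centroid(train_X, train_y, x):
    """Classify x by the nearest class-mean feature vector
    (a deliberately simple stand-in for a real classifier)."""
    centroids = {c: train_X[train_y == c].mean(axis=0)
                 for c in np.unique(train_y)}
    return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))

rng = np.random.default_rng(0)
# 20 synthetic "trials": 8 channels, 10 s each, with binary valence labels
X = np.stack([band_power_features(rng.standard_normal((8, FS * 10)))
              for _ in range(20)])
y = np.array([0, 1] * 10)
# Single-trial prediction: hold out trial 0, train on the rest
pred = nearest_centroid(X[1:], y[1:], X[0])
```

Each trial yields one feature per (band, channel) pair, so 3 bands x 8 channels gives a 24-dimensional vector; real studies typically use more channels, more bands, and a probabilistic classifier.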