Electroencephalogram (EEG)-based emotion recognition is a relatively new field within affective computing, with challenging issues regarding the induction of emotional states and the extraction of features that achieve optimum classification performance. In this paper, a novel emotion-evocation and EEG-based feature-extraction technique is presented. In particular, the mirror neuron system concept was adapted to efficiently foster emotion induction through the process of imitation. In addition, higher order crossings (HOC) analysis was employed for the feature-extraction scheme, and a robust classification method, namely the HOC-emotion classifier (HOC-EC), was implemented, testing four different classifiers [quadratic discriminant analysis (QDA), k-nearest neighbor, Mahalanobis distance, and support vector machines (SVMs)] in order to accomplish efficient emotion recognition. Through a series of facial-expression image projections, EEG data were collected from 16 healthy subjects using only three EEG channels, namely Fp1, Fp2, and a bipolar channel at the F3 and F4 positions according to the 10-20 system. Two scenarios were examined, using EEG data from a single channel and from combined channels, respectively. Compared with other feature-extraction methods, HOC-EC appears to outperform them, achieving 62.3% (using QDA) and 83.33% (using SVM) classification accuracy for the single-channel and combined-channel cases, respectively, differentiating among the six basic emotions, i.e., happiness, surprise, anger, fear, disgust, and sadness. As the dimension of the emotion class set is reduced, HOC-EC converges toward the maximum classification rate (100% for five or fewer emotions), justifying the efficiency of the proposed approach.
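To make the HOC feature-extraction step concrete, the following is a minimal illustrative sketch (not the authors' code): the k-th higher order crossing is the number of zero crossings of the zero-mean signal after k-1 applications of the backward-difference (high-pass) operator. The function name and parameters are assumptions for illustration only.

```python
import numpy as np

def hoc_features(signal, order=10):
    """Illustrative higher order crossings (HOC) feature vector.

    The k-th feature counts the simple zero crossings of the zero-mean
    signal after applying the backward-difference operator k-1 times.
    """
    x = np.asarray(signal, dtype=float)
    x = x - x.mean()                    # count crossings about the mean level
    feats = []
    for _ in range(order):
        s = np.signbit(x)               # sign sequence of the current signal
        feats.append(int(np.count_nonzero(s[1:] != s[:-1])))
        x = np.diff(x)                  # next higher-order difference
    return np.array(feats)
```

A feature vector like this, computed per EEG channel (or per channel combination), could then be passed to an off-the-shelf classifier such as an SVM, in the spirit of the HOC-EC scheme described above.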
This could facilitate the integration of HOC-EC into human-machine interfaces, such as pervasive healthcare systems, enhancing their affective character and providing information about the user's emotional status (e.g., identifying the user's emotional experiences, recurring affective states, and time-dependent emotional trends).