This paper proposes a methodology for the robust classification of neurophysiological data into four emotional states, collected during passive viewing of emotionally evocative pictures selected from the International Affective Picture System. The proposed classification model follows current neuroscience trends: it treats the two emotional dimensions, arousal and valence, as independent, as dictated by the bidirectional emotion theory, and it is gender-specific. A two-step classification procedure is proposed for discriminating emotional states in EEG signals evoked by pleasant and unpleasant stimuli that also vary in their arousal/intensity levels. The first classification step discriminates arousal; the second then discriminates valence. A Mahalanobis distance (MD) classifier and support vector machines (SVMs) were used for the discrimination of emotions. The achieved overall classification rates were 79.5% for the MD classifier and 81.3% for the SVM, significantly higher than in previous studies. The robust classification of objective emotional measures is a first step toward numerous applications within the sphere of human-computer interaction.
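The two-step procedure described above can be sketched in code: a first classifier separates low from high arousal, and a second, arousal-specific classifier then separates unpleasant from pleasant valence. This is a minimal sketch only, assuming synthetic two-dimensional features in place of real EEG descriptors and scikit-learn's `SVC` as the SVM; the `predict_emotion` helper, the cluster centers, and all parameter choices are illustrative assumptions, not the paper's actual feature set or settings.

```python
# Hedged sketch of a two-step arousal/valence classification pipeline.
# Synthetic Gaussian clusters stand in for real neurophysiological features.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def make_cluster(center, n=50):
    """Draw n 2-D points around a cluster center (synthetic stand-in data)."""
    return rng.normal(center, 0.3, size=(n, 2))

# Four synthetic emotional states on an arousal x valence grid.
X = np.vstack([make_cluster([0, 0]), make_cluster([0, 2]),
               make_cluster([2, 0]), make_cluster([2, 2])])
arousal = np.array([0] * 100 + [1] * 100)                # low vs. high arousal
valence = np.array([0] * 50 + [1] * 50 + [0] * 50 + [1] * 50)  # unpleasant vs. pleasant

# Step 1: a single classifier discriminates arousal level.
clf_arousal = SVC(kernel="rbf").fit(X, arousal)

# Step 2: one valence classifier per arousal level, trained only on
# samples of that level (the two-step structure from the abstract).
clf_valence = {a: SVC(kernel="rbf").fit(X[arousal == a], valence[arousal == a])
               for a in (0, 1)}

def predict_emotion(x):
    """Return a (arousal, valence) pair for one feature vector."""
    x = np.asarray(x, dtype=float).reshape(1, -1)
    a = int(clf_arousal.predict(x)[0])
    v = int(clf_valence[a].predict(x)[0])
    return a, v
```

The abstract's Mahalanobis distance classifier could be dropped in for either stage in place of `SVC`; the cascade structure, not the base classifier, is the point of the sketch.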