We investigated the potential for automatically detecting a learner's affective states from posture patterns and dialogue features obtained during an interaction with AutoTutor, an intelligent tutoring system with conversational dialogue. Training and validation data were collected from the sensors during a learning session with AutoTutor, after which the learner's affective states were rated by the learner, a peer, and two trained judges. Machine learning experiments with several standard classifiers indicated that the dialogue and posture features could each individually discriminate between the affective states of boredom, confusion, flow (engagement), and frustration. Our results also indicate that combining the dialogue and posture features improved classification accuracy. However, the incremental gains from combining the two sensors were not sufficient to exhibit superadditivity (i.e., performance superior to an additive combination of the individual channels). Instead, the combination of posture and dialogue reflected a modest amount of redundancy between these channels.
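The superadditivity test described above can be illustrated with a small sketch (not the authors' code or data): two synthetic feature channels standing in for posture and dialogue are classified individually and combined with a simple nearest-centroid classifier, and the combined gain over chance is compared against the sum of the individual gains. All feature values, noise levels, and the classifier choice here are illustrative assumptions.

```python
# Hypothetical sketch of a multimodal superadditivity check; the
# "posture" and "dialogue" features below are synthetic stand-ins,
# not the features used in the study.
import random

random.seed(0)
STATES = ["boredom", "confusion", "flow", "frustration"]

def make_sample(state_idx):
    # Each channel carries a noisy copy of the class signal.
    posture = [state_idx + random.gauss(0, 0.8) for _ in range(2)]
    dialogue = [state_idx + random.gauss(0, 0.8) for _ in range(2)]
    return posture, dialogue, state_idx

data = [make_sample(i % len(STATES)) for i in range(400)]
train, test = data[:300], data[300:]

def centroids(pick):
    # Mean feature vector per affective state, over the training set.
    cents = {}
    for k in range(len(STATES)):
        feats = [pick(s) for s in train if s[2] == k]
        cents[k] = [sum(col) / len(col) for col in zip(*feats)]
    return cents

def accuracy(pick):
    cents = centroids(pick)
    def classify(x):
        return min(cents, key=lambda k: sum((a - b) ** 2
                                            for a, b in zip(x, cents[k])))
    hits = sum(classify(pick(s)) == s[2] for s in test)
    return hits / len(test)

acc_p = accuracy(lambda s: s[0])         # posture channel only
acc_d = accuracy(lambda s: s[1])         # dialogue channel only
acc_c = accuracy(lambda s: s[0] + s[1])  # feature-level fusion

chance = 1 / len(STATES)
# Superadditive if the combined gain over chance exceeds the sum of the
# individual gains; otherwise the channels are at least partly redundant.
superadditive = (acc_c - chance) > (acc_p - chance) + (acc_d - chance)
print(acc_p, acc_d, acc_c, superadditive)
```

With overlapping noisy copies of the same class signal, the fused accuracy typically improves on either channel alone without clearing the additive benchmark, mirroring the redundancy pattern reported in the abstract.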