We explored the reliability of detecting learners' affect by monitoring their gross body language (body position and arousal) during interactions with an intelligent tutoring system called AutoTutor. Training and validation data on affective states were collected in a learning session with AutoTutor, after which the learners' affective states (i.e., emotions) were rated by the learner, a peer, and two trained judges. An automated body pressure measurement system captured the pressure the learner exerted on the seat and back of a chair during the tutoring session. We extracted two sets of features from the pressure maps. The first set focused on the average pressure exerted, along with the magnitude and direction of changes in pressure during emotional experiences. The second set tracked the spatial and temporal properties of naturally occurring pockets of pressure. We constructed five datasets that temporally integrated the affective judgments with the two sets of pressure features. The first four datasets corresponded to the judgments of the learner, the peer, and each of the two trained judges, whereas the fifth combined the judgments of the two trained judges. Machine-learning experiments yielded affect detection accuracies of 73%, 72%, 70%, 83%, and 74% (chance = 50%) in discriminating boredom, confusion, delight, flow, and frustration, respectively, from neutral. Accuracies for discriminations among two, three, four, and five affective states (excluding neutral) were 71%, 55%, 46%, and 40%, with chance rates of 50%, 33%, 25%, and 20%, respectively.
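The first feature set described above (average pressure plus the magnitude and direction of pressure change) can be sketched as follows. This is a minimal illustration, not the study's actual pipeline: the function name, array shapes, and the choice of net change between the first and last frames are all assumptions made for the example.

```python
import numpy as np

def pressure_features(pressure_maps):
    """Illustrative version of feature set 1: average pressure over a
    window of pressure maps, plus the magnitude and direction (sign)
    of the net pressure change across the window.

    `pressure_maps` is a (T, H, W) array of T frames from a seat or
    back sensor; names and shapes are hypothetical, not from the study.
    """
    # Mean pressure of each frame, then averaged over the window
    frame_means = pressure_maps.reshape(len(pressure_maps), -1).mean(axis=1)
    avg_pressure = frame_means.mean()

    # Net change from the first to the last frame of the window
    delta = frame_means[-1] - frame_means[0]
    return {
        "avg_pressure": float(avg_pressure),
        "change_magnitude": float(abs(delta)),
        "change_direction": float(np.sign(delta)),  # +1 pressure rising, -1 falling
    }

# Toy example: 10 frames of a 4x4 pressure map with steadily rising pressure
maps = np.ones((10, 4, 4)) * np.linspace(1.0, 2.0, 10)[:, None, None]
feats = pressure_features(maps)
```

On this toy input the window's mean pressure is 1.5, the net change is +1.0, and the direction is positive, consistent with a learner gradually pressing harder into the sensor.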