With the growing importance of information technology in everyday life, new types of applications are appearing that require the understanding of information in a broad sense. Information that includes affective and subjective content plays a major role not only in an individual's cognitive processes but also in an individual's interactions with others. We identify three key points to consider when developing systems that capture affective information: embodiment (experiencing physical reality), dynamics (mapping an experienced emotional state to its label), and adaptive interaction (conveying an emotive response and responding to a recognized emotional state). We present two computational systems that implement these principles: MOUE (Model Of User Emotions) is an emotion recognition system that recognizes the user's emotion from his/her facial expressions and, using the user's feedback, adaptively builds semantic definitions of emotion concepts; MIKE (Multimedia Interactive Environment for Kansei communication) is an interactive adaptive system that, together with the user, co-evolves a language for communicating subjective impressions.
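The adaptive-interaction loop described above — the system proposes an emotion label, the user confirms or corrects it, and the system refines its semantic definition of that emotion concept — can be sketched minimally as a nearest-centroid learner. This is an illustrative sketch only: the class and method names (`FeedbackLearner`, `observe`, `predict`) and the centroid representation are assumptions for exposition, not the actual MOUE implementation.

```python
# Hypothetical sketch of an adaptive emotion-labeling feedback loop.
# Each emotion concept is "defined" as the centroid of the facial-feature
# vectors the user has associated with it; definitions adapt as feedback
# arrives. All names here are illustrative, not drawn from MOUE itself.

from collections import defaultdict

class FeedbackLearner:
    def __init__(self):
        # emotion label -> running sum of feature vectors, and observation count
        self.sums = {}
        self.counts = defaultdict(int)

    def predict(self, features):
        """Return the label whose centroid is nearest to `features` (None if untrained)."""
        best, best_d = None, float("inf")
        for label, s in self.sums.items():
            c = self.counts[label]
            centroid = [x / c for x in s]
            d = sum((a - b) ** 2 for a, b in zip(features, centroid))
            if d < best_d:
                best, best_d = label, d
        return best

    def observe(self, features, user_label):
        """Adapt: fold the user's confirmed or corrected label into the model."""
        if user_label not in self.sums:
            self.sums[user_label] = list(features)
        else:
            self.sums[user_label] = [a + b
                                     for a, b in zip(self.sums[user_label], features)]
        self.counts[user_label] += 1

learner = FeedbackLearner()
learner.observe([0.9, 0.1], "joy")      # user labels a smiling expression
learner.observe([0.1, 0.8], "sadness")  # user labels a frowning expression
print(learner.predict([0.85, 0.2]))     # -> joy
```

The design choice worth noting is that the emotion "definitions" are not fixed: every piece of user feedback shifts a centroid, so the mapping between an experienced state and its label co-evolves with the individual user, as the dynamics principle in the abstract requires.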