Human-Computer Interaction (HCI) has largely developed along two competing methodologies: direct manipulation and intelligent agents. Complementary methodologies include augmented cognition and affective computing, and their adaptive combination. Augmented cognition harnesses computation to exploit explicit or implicit knowledge about the user's context, mental state, and motivation, while affective computing provides the means to recognize emotion and affect in the human-computer interfaces and interactions people engage with. Most HCI studies elicit emotions in relatively simple settings, whereas augmented cognition and affective computing encompass bodily (physical) events embedded within mental (cognitive) and emotional ones. Recognition of affective states currently focuses on their physical form (e.g., blinking or facial distortions underlying human emotions) rather than on their implicit behavior and function (their impact on how the user employs the interface or communicates with others). This paper examines augmented cognition and affective computing with respect to design, implementation, and benefits. Towards that end, we have designed an HCI interface that diagnoses and predicts, from non-verbal information, namely paralanguage, whether the user is fatigued, confused, frustrated, momentarily distracted, or even alive, in a virtual reality (VR) learning environment.
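The diagnostic idea described above can be illustrated with a minimal sketch: a toy rule-based classifier that maps hypothetical paralinguistic features to the user states named in the abstract. The feature names (blink rate, response latency, task-switch frequency) and all thresholds are assumptions for illustration only, not the actual model implemented in the paper.

```python
# Illustrative sketch only: feature names and thresholds are assumed,
# not taken from the paper's actual HCI interface.

def classify_user_state(blink_rate_hz: float,
                        response_latency_s: float,
                        task_switches_per_min: float) -> str:
    """Return a coarse diagnosis of the user's state from non-verbal cues."""
    # Elevated blink rate combined with slow responses suggests fatigue.
    if blink_rate_hz > 0.5 and response_latency_s > 2.0:
        return "fatigued"
    # Slow responses alone suggest confusion with the task or interface.
    if response_latency_s > 2.0:
        return "confused"
    # Frequent task switching suggests momentary distraction.
    if task_switches_per_min > 10:
        return "distracted"
    return "engaged"

print(classify_user_state(0.6, 2.5, 3))   # fatigued
print(classify_user_state(0.2, 0.5, 15))  # distracted
```

In a real system, such hand-written rules would typically be replaced by a classifier trained on observed user behavior, but the sketch shows the shape of the mapping from paralinguistic evidence to diagnosed state.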