Human computer intelligent interaction using augmented cognition and emotional intelligence

  • Authors:
  • Jim X. Chen; Harry Wechsler

  • Affiliations:
  • Computer Science Department, George Mason University, Fairfax, VA (both authors)

  • Venue:
  • ICVR'07: Proceedings of the 2nd International Conference on Virtual Reality
  • Year:
  • 2007

Abstract

Human-Computer Interaction (HCI) has mostly developed along two competing methodologies: direct manipulation and intelligent agents. Complementary methodologies include augmented cognition and affective computing, and their adaptive combination. Augmented cognition harnesses computation to exploit explicit and implicit knowledge about the user's context, mental state, and motivation, while affective computing provides the means to recognize emotional intelligence and affect in the human-computer interfaces and interactions people engage with. Most HCI studies elicit emotions in relatively simple settings, whereas augmented cognition and affective computing address bodily (physical) events embedded within mental (cognitive) and emotional ones. Recognition of affective states currently focuses on their physical form (e.g., blinking or facial distortions underlying human emotions) rather than on their implicit behavior and function (their impact on how the user employs the interface or communicates with others). This paper examines augmented cognition and affective computing with regard to design, implementation, and benefits. Towards that end, we have designed a Human-Computer Intelligent Interaction (HCII) interface that diagnoses and predicts whether the user is fatigued, confused, frustrated, momentarily distracted, or even alive, using non-verbal information, namely paralanguage, in a virtual reality (VR) learning environment.
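
To make the kind of diagnosis described above concrete, the sketch below maps a handful of paralinguistic cues to coarse user states such as "fatigued" or "distracted". It is a minimal illustration, not the authors' system: the feature names, thresholds, and decision rules are all assumptions chosen for readability, standing in for whatever sensing and classification the actual HCII interface performs.

```python
from dataclasses import dataclass


@dataclass
class ParalinguisticFeatures:
    """Hypothetical non-verbal cues sampled during a VR learning session."""
    blink_rate_hz: float       # eye blinks per second
    gaze_on_task: float        # fraction of time gaze stays on the task area (0..1)
    response_latency_s: float  # seconds between a prompt and the user's reaction
    input_activity: float      # normalized rate of user input events (0..1)


def classify_user_state(f: ParalinguisticFeatures) -> str:
    """Map paralinguistic cues to a coarse user state.

    All threshold values are illustrative placeholders, not the paper's model.
    """
    if f.input_activity == 0.0 and f.blink_rate_hz == 0.0:
        return "unresponsive"  # no sign of life at the interface
    if f.blink_rate_hz > 0.6 and f.response_latency_s > 3.0:
        return "fatigued"      # heavy blinking combined with slow reactions
    if f.gaze_on_task < 0.4:
        return "distracted"    # attention has drifted away from the task
    if f.response_latency_s > 3.0:
        return "confused"      # attentive but slow to act on prompts
    if f.input_activity > 0.8 and f.response_latency_s > 2.0:
        return "frustrated"    # rapid, repeated input without timely progress
    return "engaged"


if __name__ == "__main__":
    sample = ParalinguisticFeatures(blink_rate_hz=0.8, gaze_on_task=0.9,
                                    response_latency_s=4.0, input_activity=0.3)
    print(classify_user_state(sample))  # -> "fatigued"
```

A rule-based mapping like this is only a stand-in; a deployed interface would more plausibly learn such a mapping from labeled sessions and adapt its thresholds per user, but the input/output shape, paralinguistic features in, a diagnosed state out, matches what the abstract describes.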