Nonverbally smart user interfaces: postural and facial expression data in human computer interaction

  • Authors and affiliations:
  • G. Susanne Bahr (Florida Institute of Technology, Melbourne, Florida)
  • Carey Balaban (University of Pittsburgh, Pittsburgh, Pennsylvania)
  • Mariofanna Milanova (University of Arkansas at Little Rock, Little Rock, Arkansas)
  • Howard Choe (Raytheon - Network Centric Systems, Plano, Texas)

  • Venue:
  • UAHCI'07 Proceedings of the 4th international conference on Universal access in human-computer interaction: ambient interaction
  • Year:
  • 2007


Abstract

We suggest that User Interfaces (UIs) can be designed to serve as cognitive tools based on a model of nonverbal human interaction. Smart User Interfaces (SUIs) have the potential to support the human user when and where appropriate and thus indirectly facilitate higher mental processes without the need for end-user programming or external actuation. Moreover, graphic nonverbally sensitive SUIs are expected to be less likely to interfere with ongoing activity and disrupt the user. We present two non-invasive methods for assessing postural and facial expression components and propose a contextual analysis to guide SUI actuation and supportive action. The approach is illustrated in a possible redesign of the Microsoft helper agent "Clippit"®.