Attention meter: a vision-based input toolkit for interaction designers

  • Authors:
  • Chia-Hsun Jackie Lee; Chiun-Yi Ian Jang; Ting-Han Daniel Chen; Jon Wetzel; Yang-Ting Bowbow Shen; Ted Selker

  • Affiliations:
  • MIT Media Laboratory, Cambridge, MA; National Chiao Tung University, Hsin-Chu, Taiwan; National Chiao Tung University, Hsin-Chu, Taiwan; MIT Media Laboratory, Cambridge, MA; National Chiao Tung University, Hsin-Chu, Taiwan; MIT Media Laboratory, Cambridge, MA

  • Venue:
  • CHI '06 Extended Abstracts on Human Factors in Computing Systems
  • Year:
  • 2006


Abstract

This paper shows how a software toolkit allows graphic designers to build camera-based interactive environments in a short period of time, without experience in user interface design or machine vision. The Attention Meter, a vision-based input toolkit, analyzes the faces found in a given image stream, reporting facial expression, body motion, and attentive activities. This data is written to a text file that can be easily read by both humans and programs. A four-day workshop demonstrated that Flash-savvy architecture students could construct interactive spaces (e.g., TaiKer-KTV and ScreamMarket) driven by body and head motions.
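The abstract notes that the toolkit's face-analysis results are written to a plain text file meant to be consumed by both humans and programs. As a rough illustration of that idea, the sketch below parses a hypothetical key=value record format; the field names (`face_id`, `attention`, `expression`) and layout are assumptions for demonstration only, not the toolkit's actual output format.

```python
# Hypothetical sketch: consuming an Attention Meter-style text output.
# The record layout and field names below are illustrative assumptions;
# the real format is defined by the toolkit and is not shown in the paper.
sample = """face_id=1 x=120 y=80 attention=0.92 expression=smile
face_id=2 x=310 y=95 attention=0.41 expression=neutral"""

def parse_faces(text):
    """Parse one face record per line (space-separated key=value tokens)."""
    faces = []
    for line in text.strip().splitlines():
        record = {}
        for token in line.split():
            key, _, value = token.partition("=")
            # Convert numeric fields so a program can threshold on them.
            try:
                record[key] = float(value) if "." in value else int(value)
            except ValueError:
                record[key] = value  # non-numeric fields stay as strings
        faces.append(record)
    return faces

faces = parse_faces(sample)
# A client (e.g., an interactive installation) might react only to
# faces whose attention score exceeds some threshold.
attentive = [f["face_id"] for f in faces if f["attention"] > 0.5]
```

A flat text format like this is what makes the toolkit approachable for designers: any environment that can read a file, including Flash as used by the workshop students, can drive an installation from it without touching the vision code.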