OnObject: gestural play with tagged everyday objects

  • Authors:
  • Keywon Chung;Michael Shilman;Chris Merrill;Hiroshi Ishii

  • Affiliations:
Massachusetts Institute of Technology, Cambridge, MA, USA;Chatterpop, San Francisco, CA, USA;Massachusetts Institute of Technology, Cambridge, MA, USA;Massachusetts Institute of Technology, Cambridge, MA, USA

  • Venue:
UIST '10 Adjunct proceedings of the 23rd annual ACM symposium on User Interface Software and Technology
  • Year:
  • 2010


Abstract

Many Tangible User Interface (TUI) systems employ sensor-equipped physical objects. However, they do not easily scale to users' actual environments: most everyday objects lack the necessary hardware, and modifying them requires hardware and software development by skilled individuals. This limits TUI creation by end users, resulting in inflexible interfaces in which the mapping of sensor input and output events cannot be easily modified to reflect the end user's wishes and circumstances. We introduce OnObject, a small device worn on the hand, which can program physical objects to respond to a set of gestural triggers. Users attach RFID tags to situated objects, grab them by the tag, and program their responses to grab, release, shake, swing, and thrust gestures using a built-in button and a microphone. In this paper, we demonstrate how novice end users, including preschool children, can instantly create engaging gestural object interfaces with sound feedback from toys, drawings, or clay.
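The interaction model the abstract describes, recording a sound response for a gesture on a tagged object, then replaying it when the gesture recurs, can be sketched as a simple tag-and-gesture lookup. This is a hypothetical illustration only; the class, method names, and gesture set below are assumptions drawn from the abstract, not the paper's actual implementation.

```python
# Hypothetical sketch of OnObject-style programming by demonstration:
# holding the button while performing a gesture on a grabbed tag records
# a sound response; performing the gesture later plays it back.
GESTURES = {"grab", "release", "shake", "swing", "thrust"}

class OnObjectSketch:
    def __init__(self):
        # responses[tag_id][gesture] -> recorded sound clip (a label here)
        self.responses = {}

    def program(self, tag_id, gesture, sound_clip):
        """Record mode: associate a sound with a gesture on a tagged object."""
        if gesture not in GESTURES:
            raise ValueError(f"unsupported gesture: {gesture}")
        self.responses.setdefault(tag_id, {})[gesture] = sound_clip

    def trigger(self, tag_id, gesture):
        """Play mode: return the sound programmed for this tag and gesture."""
        return self.responses.get(tag_id, {}).get(gesture)

device = OnObjectSketch()
device.program("toy_dino", "shake", "roar.wav")
print(device.trigger("toy_dino", "shake"))  # the recorded clip
print(device.trigger("toy_dino", "swing"))  # None: not yet programmed
```

The point of the sketch is that the gesture-to-response mapping is plain data, so end users can rebind it at any time, in contrast to TUIs where the mapping is fixed in hardware and firmware.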