IAMHear: a tabletop interface with smart mobile devices using acoustic location

  • Authors:
  • Seunghun Kim; Bongjun Kim; Woon Seung Yeo

  • Affiliations:
  • KAIST (Korea Advanced Institute of Science and Technology), Daejeon, South Korea (all authors)

  • Venue:
  • CHI '13 Extended Abstracts on Human Factors in Computing Systems
  • Year:
  • 2013


Abstract

IAMHear is a novel tabletop interface for music performance and sound making in which smart mobile devices serve as on-table objects for interaction. Thanks to the advanced features of these devices, IAMHear is inherently multimodal and highly interactive. The system locates objects acoustically using virtually inaudible sound, without any special sensors, which keeps its structure simple and easy to implement. In addition, the use of "everyday objects" invites intuitive gestures such as placement, movement, and rotation. As a music sequencer, IAMHear lets the user make music by placing objects on the table; inspired by the idea of spectrographic mapping with a virtual scan line, the pitch and timbre of sounds are determined by the location and orientation of the tabletop objects as well as by ambient noise. We present IAMHear as a simple, novel alternative interactive tabletop interface for music and other multimedia applications.
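The abstract does not detail the acoustic location mechanism, but a common building block for locating a sound-emitting object without special sensors is estimating the time-difference-of-arrival (TDOA) of an inaudible probe signal at a pair of microphones via cross-correlation. The sketch below is a hypothetical illustration of that technique, not the paper's actual algorithm; the sample rate, speed of sound, and probe-signal construction are all assumptions for the example.

```python
# Hypothetical sketch (not the paper's actual method): estimating the
# time-difference-of-arrival (TDOA) of a probe signal at two microphones
# by brute-force cross-correlation. A real acoustic-location system would
# combine several such pairwise delays to triangulate a table position.
import random

SPEED_OF_SOUND = 343.0  # m/s at room temperature (assumed)
SAMPLE_RATE = 48_000    # Hz, a typical mobile-device audio rate (assumed)

def cross_correlation_lag(a, b, max_lag):
    """Return the lag (in samples) at which signal b best aligns with a.

    A positive result means b is delayed relative to a.
    """
    best_lag, best_score = 0, float("-inf")
    for lag in range(-max_lag, max_lag + 1):
        score = sum(a[i] * b[i + lag]
                    for i in range(len(a))
                    if 0 <= i + lag < len(b))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag

# Simulate a broadband probe burst (a stand-in for a near-ultrasonic chirp)
# arriving 7 samples later at the second microphone.
rng = random.Random(0)
probe = [rng.uniform(-1.0, 1.0) for _ in range(256)]
delay = 7
mic_a = probe + [0.0] * delay
mic_b = [0.0] * delay + probe

lag = cross_correlation_lag(mic_a, mic_b, max_lag=20)   # recovers 7
path_difference_m = lag / SAMPLE_RATE * SPEED_OF_SOUND  # extra path length (m)
print(lag, round(path_difference_m, 4))
```

The recovered lag maps directly to a difference in path length from the object to the two microphones; with two or more microphone pairs, the intersection of the resulting hyperbolic constraints yields the object's table position.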