EventHurdle: supporting designers' exploratory interaction prototyping with gesture-based sensors

  • Authors:
  • Ju-Whan Kim; Tek-Jin Nam

  • Affiliations:
  • KAIST (Korea Advanced Institute of Science and Technology), Daejeon, Republic of Korea; KAIST (Korea Advanced Institute of Science and Technology), Daejeon, Republic of Korea

  • Venue:
  • Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
  • Year:
  • 2013

Abstract

Prototyping gestural interactions in the early phase of design is one of the most challenging tasks for designers without advanced programming skills. Relating users' gestural input to raw values from gesture-based sensors requires a great deal of effort on the designer's part and disturbs their reflective and creative thinking. To address this problem, we present EventHurdle, a visual gesture-authoring tool that supports designers' explorative prototyping. It supports remote gestures captured by a camera, handheld gestures with physical sensors, and touch gestures on touch screens. EventHurdle allows designers to visually define and modify gestures through an interaction workspace and a graphical markup language based on hurdles. Because the authored gestures are automatically recognized and can be integrated into a prototype as programming code, designers do not need to attend to sensor-related implementation. Two user studies and a recognition test are reported to discuss the acceptance and implications of explorative prototyping tools for designers.
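
To make the hurdle idea concrete, the following is a minimal, hypothetical sketch (not the authors' implementation): a gesture is modeled as an ordered list of "hurdles" (line segments in a 2D sensor-value workspace), and the gesture fires once the incoming sensor trajectory crosses each hurdle in sequence. All names here (HurdleGesture, feed, segments_cross) are illustrative assumptions, not EventHurdle's actual API.

```python
from __future__ import annotations
from dataclasses import dataclass

Point = tuple[float, float]

def _ccw(a: Point, b: Point, c: Point) -> bool:
    # True if points a, b, c are in counter-clockwise order.
    return (c[1] - a[1]) * (b[0] - a[0]) > (b[1] - a[1]) * (c[0] - a[0])

def segments_cross(p1: Point, p2: Point, q1: Point, q2: Point) -> bool:
    """True if segment p1-p2 intersects segment q1-q2 (general position)."""
    return (_ccw(p1, q1, q2) != _ccw(p2, q1, q2)
            and _ccw(p1, p2, q1) != _ccw(p1, p2, q2))

@dataclass
class HurdleGesture:
    name: str
    hurdles: list[tuple[Point, Point]]   # ordered hurdle segments in sensor space
    _next: int = 0                       # index of the next hurdle to cross
    _prev: Point | None = None           # previous sensor sample

    def feed(self, sample: Point) -> bool:
        """Feed one 2D sensor sample; return True when the gesture completes."""
        if self._prev is not None:
            a, b = self.hurdles[self._next]
            if segments_cross(self._prev, sample, a, b):
                self._next += 1
                if self._next == len(self.hurdles):
                    self._next = 0       # reset so the gesture can fire again
                    self._prev = sample
                    return True
        self._prev = sample
        return False

# Usage: a "swipe right" defined by two vertical hurdles crossed left to right.
swipe = HurdleGesture("swipe-right", [((0.3, 0.0), (0.3, 1.0)),
                                      ((0.7, 0.0), (0.7, 1.0))])
for x in (0.1, 0.4, 0.6, 0.9):
    if swipe.feed((x, 0.5)):
        print("gesture recognized")
```

In this sketch, recognition is purely sequential hurdle crossing over a stream of 2D samples, which is one plausible reading of how visually authored hurdles could be turned into generated recognizer code for a prototype.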