Gesture-based interaction: a new dimension for mobile user interfaces

  • Authors:
  • Yang Li

  • Affiliations:
  • Google Research, Amphitheatre Parkway, Mountain View, CA

  • Venue:
  • Proceedings of the International Working Conference on Advanced Visual Interfaces
  • Year:
  • 2012

Abstract

Today, smartphones with touchscreens and sensors are the predominant and fastest-growing class of consumer computing devices. However, because these devices are used in diverse situations and have unique capabilities and form factors, they raise new user interface challenges and, at the same time, offer great opportunities for impactful HCI research. In this talk, I will focus on gesture-based interaction, an important interaction behavior enabled by touchscreens and built-in sensors that sets mobile interaction apart from traditional graphical user interfaces. I will first talk about gesture shortcuts in the context of Gesture Search [1], a tool that allows users to quickly access applications and data on the phone by simply drawing a few gestures (http://www.google.com/mobile/gesture-search). Gesture Search flattens the mobile phone's UI hierarchy by alleviating the need to navigate the interface. Gesture Search has been released and is invoked hundreds of thousands of times per day by a large user population. I will then cover several related projects that furthered our investigation into gesture shortcuts, including using gestures for target acquisition [3], crowdsourcing-based gesture recognition [5], and our early exploration of motion gestures [4, 6, 7]. Finally, I will turn to multi-touch gestures for direct manipulation of an interface, the dominant class of gesture-based interaction on existing commercial devices. Multi-touch gestures are intuitive and efficient to use, but can be difficult to implement. I will discuss tools that allow developers to more easily create multi-touch interaction behaviors by demonstration [2]. Together, these projects investigate various aspects of gesture-based interaction on mobile devices and help open a new dimension for mobile interaction.
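
To make the idea of gesture shortcuts concrete, the following is a minimal, illustrative Kotlin sketch using Android's stock android.gesture API (GestureOverlayView, GestureLibraries, GestureLibrary). It is not the Gesture Search implementation; the layout and resource names (R.layout.activity_gesture, R.id.gesture_overlay, R.raw.shortcuts), the score threshold, and the launchShortcut() helper are assumptions for illustration only.

```kotlin
import android.app.Activity
import android.gesture.Gesture
import android.gesture.GestureLibraries
import android.gesture.GestureLibrary
import android.gesture.GestureOverlayView
import android.os.Bundle
import android.widget.Toast

class GestureShortcutActivity : Activity(), GestureOverlayView.OnGesturePerformedListener {

    private lateinit var library: GestureLibrary

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_gesture)  // hypothetical layout containing a GestureOverlayView

        // Load a prebuilt library of gesture templates (e.g., letters mapped to shortcuts).
        library = GestureLibraries.fromRawResource(this, R.raw.shortcuts)
        if (!library.load()) finish()

        findViewById<GestureOverlayView>(R.id.gesture_overlay)
            .addOnGesturePerformedListener(this)
    }

    override fun onGesturePerformed(overlay: GestureOverlayView, gesture: Gesture) {
        // Rank candidate matches and act on the best one above a confidence threshold.
        val best = library.recognize(gesture).firstOrNull() ?: return
        if (best.score > 2.0) {
            launchShortcut(best.name)  // hypothetical helper: open the app, contact, or bookmark named by the gesture
        } else {
            Toast.makeText(this, "Gesture not recognized", Toast.LENGTH_SHORT).show()
        }
    }

    private fun launchShortcut(name: String) {
        // A real shortcut tool would resolve `name` to on-device content and open it directly,
        // bypassing menu navigation.
    }
}
```

The key point of the sketch is the shortcut path itself: one drawn stroke is matched against templates and resolved straight to a target, rather than requiring the user to navigate through the UI hierarchy.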
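
For contrast, the sketch below shows the kind of multi-touch direct-manipulation code (a pinch-to-zoom view built on Android's ScaleGestureDetector) that developers typically write by hand, and that demonstration-based tools such as the one described in [2] aim to help generate. It is an illustrative assumption, not an example from the talk; the ZoomableView class, scale limits, and drawing logic are invented for this sketch.

```kotlin
import android.content.Context
import android.graphics.Canvas
import android.util.AttributeSet
import android.view.MotionEvent
import android.view.ScaleGestureDetector
import android.view.View

class ZoomableView(context: Context, attrs: AttributeSet? = null) : View(context, attrs) {

    private var scale = 1.0f

    // The detector tracks two-finger pinch state and reports incremental scale factors.
    private val scaleDetector = ScaleGestureDetector(context,
        object : ScaleGestureDetector.SimpleOnScaleGestureListener() {
            override fun onScale(detector: ScaleGestureDetector): Boolean {
                scale = (scale * detector.scaleFactor).coerceIn(0.5f, 4.0f)
                invalidate()  // redraw at the new zoom level
                return true
            }
        })

    override fun onTouchEvent(event: MotionEvent): Boolean {
        scaleDetector.onTouchEvent(event)
        return true
    }

    override fun onDraw(canvas: Canvas) {
        canvas.save()
        canvas.scale(scale, scale, width / 2f, height / 2f)
        // ... draw the zoomable content here ...
        canvas.restore()
    }
}
```

Even this small example requires coordinating raw touch events, detector state, and rendering; richer behaviors (rotation, multi-object manipulation, mode switching) multiply that bookkeeping, which is what makes tool support for authoring multi-touch interactions attractive.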