When multi-touch meets streaming

  • Authors:
  • Zimu Liu, Yuan Feng, Baochun Li

  • Affiliations:
  • University of Toronto

  • Venue:
  • Proceedings of the 10th International Conference on Mobile and Ubiquitous Multimedia
  • Year:
  • 2011

Abstract

With the advent of mobile devices with large displays, it is intuitive and natural for users to interact with an application on a mobile device using multi-touch gestures. In this paper, we propose that these multi-touch gestures can be streamed on-the-fly among multiple participating users, making it possible to engage users in a collaborative or competitive experience. Such multi-touch streams, featuring very low streaming bit rates, can be rendered on receivers to precisely reconstruct the states of an application. We present the challenges, system framework, embedded algorithm design, and real-world evaluation of TouchTime, a new system that has been designed from scratch to facilitate the streaming of multi-touch gestures among multiple users. By seamlessly combining local computation on mobile devices and services from the "cloud," we explore the design space of suitable mechanisms to represent and packetize multi-touch gestures, and of practical protocols to transport concurrent live multi-touch streams over the Internet. Specifically, we propose an auction-based reflector selection algorithm to achieve the minimal end-to-end delay in a live multi-touch streaming session. To demonstrate TouchTime, we have developed a new real-world music composition application --- called MusicScore --- using the Apple iPad Programming SDK, and used it as our running example and experimental testbed to evaluate our design choices and implementation of TouchTime.
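The abstract does not detail the auction-based reflector selection algorithm, but the optimization goal it states — minimizing end-to-end delay in a live streaming session — can be illustrated with a simple sketch. The following Python snippet picks, from a set of candidate reflectors, the one that minimizes the worst-case sender-to-receiver delay; the delay model, node names, and selection rule here are assumptions for illustration, not the paper's actual algorithm.

```python
# Illustrative sketch only: choose a reflector that minimizes the worst-case
# end-to-end delay (sender -> reflector -> receiver). The delay tables and
# names below are hypothetical, not taken from the TouchTime paper.

def select_reflector(delay_to, delay_from, senders, receivers, reflectors):
    """delay_to[(node, r)]: measured latency from node to reflector r (ms).
    delay_from[(r, node)]: measured latency from reflector r to node (ms).
    Returns (best_reflector, its worst-case end-to-end delay)."""
    best, best_cost = None, float("inf")
    for r in reflectors:
        # Worst-case path delay through reflector r over all sender/receiver pairs.
        worst = max(delay_to[(s, r)] + delay_from[(r, d)]
                    for s in senders for d in receivers)
        if worst < best_cost:
            best, best_cost = r, worst
    return best, best_cost

senders = ["alice"]
receivers = ["bob", "carol"]
reflectors = ["tor", "nyc"]
delay_to = {("alice", "tor"): 10, ("alice", "nyc"): 25}
delay_from = {("tor", "bob"): 30, ("tor", "carol"): 15,
              ("nyc", "bob"): 12, ("nyc", "carol"): 14}

print(select_reflector(delay_to, delay_from, senders, receivers, reflectors))
# -> ('nyc', 39): the "nyc" reflector caps every path at 39 ms, while "tor"
#    would let the alice -> bob path reach 40 ms.
```

An auction-based scheme, as named in the abstract, would distribute this choice by having candidate reflectors bid with their measured delays rather than relying on a central table, but the single-coordinator version above captures the same objective.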