Plug&touch: a mobile interaction solution for large display via vision-based hand gesture detection

  • Authors:
  • Lei Xu; Yikai Fang; Kongqiao Wang; Jiangwei Li

  • Affiliations:
  • Multimedia Lab, Nokia Research Center, Beijing, China (all authors)

  • Venue:
  • Proceedings of the 20th ACM international conference on Multimedia
  • Year:
  • 2012

Abstract

Smart phones and external large displays are a powerful combination for content sharing and multi-person interaction. However, the potential of this combination is not fully exploited due to the lack of effective and portable interaction solutions. To address this problem, we present a novel mobile solution called Plug&Touch, which is easy to set up and enables mobile users to operate directly and naturally on the large display surface. Plug&Touch uses vision-based methods to detect the user's fingertip and then control the user interface. A Local Binary Pattern (LBP) asymmetric texture difference is proposed to extract the foreground region, and a coarse-to-fine intersection process is adopted to precisely localize the fingertip. Plug&Touch has been implemented on the Symbian^3 platform with two defined gestures, click and scroll. Experimental results and a user evaluation have shown the practicability of Plug&Touch.
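
The abstract only names the techniques, so the following is a minimal sketch, not the authors' implementation, of how an LBP texture difference between a live frame and a stored background frame could flag the hand region and yield a crude fingertip candidate. The Hamming-distance threshold, the helper names, and the top-most-pixel heuristic are assumptions introduced for illustration; the paper's asymmetric texture difference and coarse-to-fine intersection step are more involved.

```python
# Hypothetical sketch of LBP-based foreground extraction and a naive
# fingertip guess; NOT the Plug&Touch implementation.
import numpy as np

def lbp_8(gray):
    """Compute the basic 8-neighbour Local Binary Pattern code per pixel."""
    g = gray.astype(np.int32)
    c = g[1:-1, 1:-1]
    # Eight neighbours of each interior pixel, ordered clockwise.
    neighbours = [g[0:-2, 0:-2], g[0:-2, 1:-1], g[0:-2, 2:],
                  g[1:-1, 2:],   g[2:,   2:],   g[2:,   1:-1],
                  g[2:,   0:-2], g[1:-1, 0:-2]]
    code = np.zeros_like(c, dtype=np.uint8)
    for bit, n in enumerate(neighbours):
        code |= ((n >= c).astype(np.uint8) << bit)
    return code

def texture_foreground_mask(frame_gray, background_gray, diff_thresh=3):
    """Mark pixels whose LBP code differs from the background's by more than
    diff_thresh bits (Hamming distance); a crude stand-in for the paper's
    asymmetric texture difference."""
    xor = np.bitwise_xor(lbp_8(frame_gray), lbp_8(background_gray))
    hamming = np.unpackbits(xor[..., None], axis=-1).sum(axis=-1)
    return hamming > diff_thresh

def fingertip_candidate(mask):
    """Return (row, col) of the top-most foreground pixel as a rough
    fingertip guess; the paper refines localization coarse-to-fine."""
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None
    i = int(np.argmin(ys))
    return int(ys[i]), int(xs[i])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    bg = rng.integers(0, 256, (120, 160), dtype=np.uint8)
    frame = bg.copy()
    frame[40:80, 60:100] = 200            # synthetic "hand" patch
    mask = texture_foreground_mask(frame, bg)
    print("foreground pixels:", int(mask.sum()))
    print("fingertip candidate:", fingertip_candidate(mask))
```

In a real pipeline the mask would feed gesture classification, e.g. mapping a brief dwell of the fingertip to a click and sustained motion to a scroll, as the paper's two defined gestures suggest.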