Interaction with large ubiquitous displays using camera-equipped mobile phones

  • Authors:
  • Seokhee Jeon; Jane Hwang; Gerard J. Kim; Mark Billinghurst

  • Affiliations:
  • Department of Computer Science and Engineering, POSTECH, Pohang, Korea; Image and Media Research Center, Korea Institute of Science and Technology, Seoul, Korea; College of Information and Communication, Korea University, Seoul, Korea; Human Interface Technology Laboratory NZ, University of Canterbury, Christchurch, New Zealand

  • Venue:
  • Personal and Ubiquitous Computing
  • Year:
  • 2010

Abstract

In a ubiquitous computing environment, people will interact with everyday objects (and the computers embedded in them) in ways that differ from the familiar desktop user interface. One typical situation is interacting with applications through large displays such as televisions, mirror displays, and public kiosks. For such applications, conventional keyboard and mouse input is usually impractical. In this setting, the mobile phone has emerged as an excellent device for novel interaction. This article introduces interaction techniques that use a camera-equipped hand-held device, such as a mobile phone or a PDA, with large shared displays. In particular, we consider two specific but typical situations: (1) sharing the display from a distance and (2) interacting with a touch-screen display at close range. Using two basic computer vision techniques, motion flow and marker recognition, we show how a camera-equipped hand-held device can effectively replace a mouse and be used to share, select, and manipulate 2D and 3D objects and to navigate within the environment presented on the large display.
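
The abstract names motion flow as one of the two vision techniques used for pointing. The sketch below is a minimal illustration, not the authors' implementation: it assumes OpenCV and a standard camera, estimates the camera's apparent motion with Lucas-Kanade sparse optical flow, and maps the median flow vector to a relative cursor displacement on a shared display. The move_cursor stub and the GAIN constant are illustrative assumptions.

```python
# Minimal sketch: estimate hand-held camera motion with Lucas-Kanade optical
# flow and map it to cursor movement on a large display. Assumes OpenCV; the
# move_cursor() stub and GAIN constant are illustrative placeholders.
import cv2
import numpy as np

GAIN = 4.0  # cursor pixels per pixel of image flow (assumed)

def move_cursor(dx, dy):
    """Placeholder: send a relative cursor move to the shared display."""
    print(f"cursor move: dx={dx:+.1f}, dy={dy:+.1f}")

def main():
    cap = cv2.VideoCapture(0)
    ok, frame = cap.read()
    if not ok:
        return
    prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

        # Track corner features from the previous frame into the current one.
        prev_pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                           qualityLevel=0.01, minDistance=8)
        if prev_pts is not None:
            next_pts, status, _ = cv2.calcOpticalFlowPyrLK(
                prev_gray, gray, prev_pts, None)
            good = status.ravel() == 1
            if good.any():
                flow = (next_pts[good] - prev_pts[good]).reshape(-1, 2)
                # Median flow is robust to a few bad tracks; the camera moves
                # opposite to the apparent image motion, hence the sign flip.
                dx, dy = -np.median(flow, axis=0)
                move_cursor(GAIN * dx, GAIN * dy)

        prev_gray = gray
        if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
            break

    cap.release()

if __name__ == "__main__":
    main()
```

In practice a marker-based mode (the second technique named in the abstract) would provide absolute positioning when the display's touch surface or a printed marker is visible, while the flow-based mode above only yields relative motion.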