Deploying and evaluating a mixed reality mobile treasure hunt: Snap2Play

  • Authors:
  • Yilun You;Tat Jun Chin;Joo Hwee Lim;Jean-Pierre Chevallet;Céline Coutrix;Laurence Nigay

  • Affiliations:
  • Institute for Infocomm Research, Singapore;Institute for Infocomm Research, Singapore;Institute for Infocomm Research, Singapore;Institute for Infocomm Research, Singapore;Laboratoire d'Informatique de Grenoble;Laboratoire d'Informatique de Grenoble

  • Venue:
  • Proceedings of the 10th international conference on Human computer interaction with mobile devices and services
  • Year:
  • 2008


Abstract

Given current trends, we can anticipate that future mobile phones will have ever-increasing computational power and will embed multiple sensors and actuators, including cameras, GPS, orientation sensors, touch surfaces and vibro-tactile displays. Such powerful mobile platforms enable us to deploy mixed reality systems. Many studies on mobile mixed reality focus on games. In this paper, we describe the deployment and a user study of a mixed reality location-based mobile treasure hunt, Snap2Play [1], which uses technologies such as place recognition, accelerometers and GPS tracking to enhance interaction with the game and hence its playability. The game we deployed and tested runs on an off-the-shelf camera phone.