Spatial-geometric approach to physical mobile interaction based on accelerometer and IR sensory data fusion

  • Authors:
  • Abu Saleh Md Mahfujur Rahman; M Anwar Hossain; Abdulmotaleb El Saddik

  • Affiliations:
  • University of Ottawa, ON, Canada; University of Ottawa, ON, Canada and King Saud University, Riyadh, Saudi Arabia; University of Ottawa, ON, Canada

  • Venue:
  • ACM Transactions on Multimedia Computing, Communications, and Applications (TOMCCAP)
  • Year:
  • 2010

Abstract

Interaction with the physical environment using mobile phones has become increasingly desirable and feasible. Mobile phones are now used to control different devices and to access information and services related to them. To facilitate such interaction, devices are usually marked with RFID tags or visual markers, which a mobile phone equipped with an integrated RFID reader or camera reads to fetch related information about those objects and initiate further actions. This article contributes to this domain of physical mobile interaction, but adopts a spatial-geometric approach for interacting with indoor physical objects and artifacts instead of RFID-based solutions. In this approach, a mobile phone can point from a distance at an annotated object, or at a spatial subregion of that object, for the purpose of interaction. The pointing direction and location are determined by fusing IR camera and accelerometer data: the IR cameras compute the 3D position of the mobile phone user, while the phone's accelerometer provides its tilt and orientation. Objects and their subregions are annotated by specifying their geometric coordinates and associating related information or services with them. We conduct experiments in a technology-augmented smart space and demonstrate the applicability and potential of the proposed approach.
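The abstract itself gives no formulas, but the pointing scheme it describes — a 3D phone position from the IR cameras, a tilt/orientation from the accelerometer, and annotated regions specified by geometric coordinates — can be sketched as a ray cast from the phone's position along its pointing direction, tested against axis-aligned boxes standing in for the annotated subregions. The sketch below is illustrative only: the angle convention, the `pointing_direction` and `ray_hits_box` helpers, and the box representation are assumptions, not the authors' actual method.

```python
import math

def pointing_direction(pitch_deg, yaw_deg):
    """Unit direction vector for a phone tilted by `pitch_deg` (up/down)
    and rotated by `yaw_deg` (left/right); angles are assumed to come
    from the accelerometer/IR fusion. Convention is illustrative."""
    p, y = math.radians(pitch_deg), math.radians(yaw_deg)
    return (math.cos(p) * math.sin(y),   # x: sideways component
            math.sin(p),                 # y: vertical component
            math.cos(p) * math.cos(y))   # z: forward component

def ray_hits_box(origin, direction, box_min, box_max):
    """Slab test: does the ray from `origin` along `direction` pass
    through the axis-aligned box [box_min, box_max]? The box stands in
    for one annotated object subregion."""
    t_near, t_far = 0.0, float("inf")
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if abs(d) < 1e-9:                      # ray parallel to this slab
            if not (lo <= o <= hi):
                return False
        else:
            t1, t2 = (lo - o) / d, (hi - o) / d
            t_near = max(t_near, min(t1, t2))
            t_far = min(t_far, max(t1, t2))
            if t_near > t_far:                 # slabs do not overlap
                return False
    return True

# Hypothetical setup: phone held at 1.5 m, pointing straight ahead (+z);
# an annotated "display" region sits 3 m in front of the user.
phone_pos = (0.0, 1.5, 0.0)
direction = pointing_direction(pitch_deg=0.0, yaw_deg=0.0)
display_region = ((-0.5, 1.0, 3.0), (0.5, 2.0, 3.2))
hit = ray_hits_box(phone_pos, direction, *display_region)
```

In a full system each annotated region would carry associated information or services, and the first region the ray intersects would determine which service the phone invokes; the slab test here is just one standard way to perform that geometric lookup.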