SurroundSense: mobile phone localization using ambient sound and light

  • Authors: Martin Azizyan; Romit Roy Choudhury
  • Affiliations: Duke University, Durham, NC; Duke University, Durham, NC
  • Venue: ACM SIGMOBILE Mobile Computing and Communications Review
  • Year: 2009

Abstract

Proliferating mobile phones provide a foundation for revolutionary innovations in people-centric computing. Numerous applications are on the rise, many of which exploit the phone's location as the primary indicator of context. We argue that existing physical localization schemes based on GPS/WiFi/GSM have limitations that make them impractical for such applications. Instead, in this poster we describe a means of localization in which phones sense their surroundings and use this ambient information to classify their location. Put differently, we postulate that different surroundings have photo-acoustic fingerprints that can be sensed and used for localization. We demonstrate feasibility using Tmote Invent motes equipped with light and sound sensors. Our ongoing work extends SurroundSense to the mobile phone platform and exploits additional sensors (such as accelerometers and compasses) toward even better localization.
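The abstract does not detail how fingerprints are built or matched. As a rough illustration of the idea only, the sketch below summarizes ambient light and sound samples into a small statistical feature vector and classifies a new reading against labeled fingerprints by nearest-neighbor matching. The `Fingerprint` type, the feature choices, and the toy data are assumptions for illustration, not the authors' actual method.

```python
import math
from dataclasses import dataclass

@dataclass
class Fingerprint:
    """Hypothetical photo-acoustic fingerprint: summary statistics of ambient samples."""
    label: str       # e.g. "cafe"
    features: tuple  # (mean light, std light, mean sound amplitude, std sound amplitude)

def extract_features(light_samples, sound_samples):
    """Reduce raw ambient light and sound readings to a small feature vector.
    Illustrative choice of features, not the SurroundSense feature set."""
    def mean(xs):
        return sum(xs) / len(xs)
    def std(xs):
        m = mean(xs)
        return math.sqrt(sum((x - m) ** 2 for x in xs) / len(xs))
    return (mean(light_samples), std(light_samples),
            mean(sound_samples), std(sound_samples))

def classify(light_samples, sound_samples, database):
    """Return the label of the nearest stored fingerprint (Euclidean distance)."""
    query = extract_features(light_samples, sound_samples)
    def dist(fp):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(query, fp.features)))
    return min(database, key=dist).label

# Toy usage: two labeled places, then a new reading whose ambience is closer to the cafe.
db = [
    Fingerprint("cafe", extract_features([310, 305, 320], [0.42, 0.47, 0.40])),
    Fingerprint("bookstore", extract_features([120, 118, 125], [0.05, 0.06, 0.04])),
]
print(classify([300, 315, 308], [0.45, 0.44, 0.41], db))  # -> "cafe"
```

In practice, such ambient fingerprints would likely need richer features and more robust matching; the point here is only to make the "sense the surroundings, then classify" pipeline concrete.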