Code in the air: simplifying sensing on smartphones

  • Authors:
  • Tim Kaler, John Patrick Lynch, Timothy Peng, Lenin Ravindranath, Arvind Thiagarajan, Hari Balakrishnan, Sam Madden

  • Affiliations:
  • MIT Computer Science and Artificial Intelligence Laboratory (all authors)

  • Venue:
  • Proceedings of the 8th ACM Conference on Embedded Networked Sensor Systems
  • Year:
  • 2010


Abstract

Modern smartphones are equipped with a wide variety of sensors, including GPS, WiFi and cellular radios capable of positioning, accelerometers, magnetic compasses and gyroscopes, light and proximity sensors, and cameras. These sensors have made smartphones an attractive platform for collaborative sensing (also known as crowdsourcing) applications, in which phones cooperatively collect sensor data to perform various tasks. Researchers and mobile application developers have built a wide variety of such applications. Examples include BikeTastic [4] and BikeNet [1], which allow bicyclists to collaboratively map and visualize biking trails; SoundSense [3], which collects and analyzes microphone data; iCartel [2], which crowdsources driving tracks from users to monitor road traffic in real time; and Transitgenie [5], which cooperatively tracks buses and trains.