Smart home care network using sensor fusion and distributed vision-based reasoning

  • Authors:
  • Ali Maleki Tabar; Arezou Keshavarz; Hamid Aghajan

  • Affiliations:
  • Wireless Sensor Networks Lab, Stanford, CA (all authors)

  • Venue:
  • Proceedings of the 4th ACM international workshop on Video surveillance and sensor networks
  • Year:
  • 2006


Abstract

A wireless sensor network employing multiple sensing and event-detection modalities and distributed processing is proposed for smart home monitoring applications. Image sensing and vision-based reasoning are employed to verify and further analyze events reported by other sensors. The system has been developed to address the growing application domain of caregiving for the elderly and for persons in need of monitored living who wish to live independently while enjoying the assurance of timely access to caregivers when needed. An example of a sensed event is an accidental fall of the person under care. A wireless badge node acts as a bridge between the user and the network. The badge node provides user-centric event-sensing functions such as fall detection, and it also provides a voice communication channel between the user and the caregiving center when the system detects an alert and dials the center. The voice connection is carried over an IEEE 802.15.4 radio link between the user badge and another network node that acts as a modem. Using signal-strength measurements, the network nodes keep track of the approximate location of the user in the monitoring environment. The network also includes wall-mounted image sensor nodes, which are triggered upon detection of a fall to analyze their fields of view and provide the caregiving center with further information about the user's status. A description of the developed network and several examples of the vision-based reasoning algorithm are presented in the paper.
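The abstract mentions that network nodes track the user's approximate location from signal-strength measurements. The paper's actual algorithm is not given here, but a minimal sketch of one common approach, nearest-anchor localization, illustrates the idea: each fixed node reports the RSSI it observes from the user's badge, and the user is assumed to be in the zone of the node with the strongest signal. All function and zone names below are hypothetical.

```python
def estimate_zone(rssi_by_node):
    """Coarse localization sketch: pick the anchor node with the
    strongest RSSI (values in dBm; closer to 0 means stronger).

    rssi_by_node maps a zone/node name to the RSSI measured there.
    Returns None if no readings are available.
    """
    if not rssi_by_node:
        return None
    # max over dBm values: -55 dBm beats -80 dBm, so the nearest
    # anchor (by signal strength) wins.
    return max(rssi_by_node, key=rssi_by_node.get)

# Example: the living-room node hears the badge loudest.
readings = {"kitchen": -71, "living_room": -55, "bedroom": -80}
print(estimate_zone(readings))  # -> living_room
```

In practice RSSI fluctuates with multipath and body shadowing, so a deployed system would smooth readings over time before deciding a zone; this sketch shows only the core selection step.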