Event-based experiments in an assistive environment using wireless sensor networks and voice recognition

  • Authors:
  • Eric Becker, Zhengyi Le, Kyungseo Park, Yong Lin, Fillia Makedon

  • Affiliations:
  • University of Texas at Arlington (all authors)

  • Venue:
  • Proceedings of the 2nd International Conference on PErvasive Technologies Related to Assistive Environments
  • Year:
  • 2009


Abstract

As the population ages, more and more people require additional health care, whether at home, in the workplace, or in a nursing facility, so a need now exists for health monitoring outside of hospital settings. These conditions make wireless sensor technology attractive for building health-care monitoring systems that can be deployed in many different environments, including the home. Other systems in development employ a wide range of sensors, including cameras, and record the information for processing; these systems typically seed an apartment environment with sensors to detect human behavior and activities. While such systems are embedded in assistive environments, they lack a comprehensive approach to describing events and do not support general, rapid deployment into different configurations using wireless technology. In this paper, we present our ongoing project of deploying sensors into an assistive environment. We currently use SunSPOT sensor motes, each programmed for a specific role based on rules describing events. In addition, we are developing a voice recognition system that reacts to human input in the same environment. Our system can be deployed rapidly without requiring additional wiring or unwanted intrusion into the patient's life.
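The abstract describes motes that are each assigned a role defined by rules mapping sensor readings to named events, but gives no implementation details. As a minimal illustrative sketch of that idea (all class names, rule names, and thresholds below are hypothetical, not taken from the paper):

```python
# Hypothetical sketch of rule-based event detection on a sensor mote.
# The paper's actual rule language and SunSPOT code are not given in
# the abstract; every name and threshold here is illustrative only.
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class Rule:
    """Maps a predicate over one sensor reading to a named event."""
    event: str
    predicate: Callable[[Dict], bool]


class MoteRole:
    """A role assigned to one mote: the subset of rules it evaluates."""

    def __init__(self, name: str, rules: List[Rule]):
        self.name = name
        self.rules = rules

    def detect(self, reading: Dict) -> List[str]:
        """Return the names of all events whose rules fire on this reading."""
        return [r.event for r in self.rules if r.predicate(reading)]


# Example: a mote whose role is to watch a door contact and an accelerometer.
door_role = MoteRole("door-monitor", [
    Rule("door-opened", lambda s: s.get("door") == "open"),
    Rule("possible-fall", lambda s: s.get("accel", 0.0) > 2.5),
])

print(door_role.detect({"door": "open", "accel": 0.3}))   # ['door-opened']
print(door_role.detect({"door": "closed", "accel": 3.1})) # ['possible-fall']
```

Separating the rule set from the mote logic is what allows the rapid redeployment the abstract claims: reconfiguring an environment means assigning different rule sets to motes, not rewriting their firmware.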