Real-Time human tracker based location and motion recognition for the ubiquitous smart home

  • Authors:
  • Jonghwa Choi; Soonyong Choi; Dongkyoo Shin; Dongil Shin

  • Affiliations:
  • Department of Computer Science and Engineering, Sejong University, Seoul, Korea (all authors)

  • Venue:
  • UIC'06 Proceedings of the Third international conference on Ubiquitous Intelligence and Computing
  • Year:
  • 2006


Abstract

The ubiquitous smart home is the home of the future: it exploits context information from the human and the home environment to provide automatic home services. Human location and motion are the most important contexts in the ubiquitous smart home. We present a real-time human tracker that predicts human location and motion for the ubiquitous smart home, using four network cameras for real-time tracking. This paper explains the real-time human tracker's architecture and presents its algorithm, detailing its two functions: prediction of human location and prediction of human motion. Location prediction uses three kinds of background images (IMAGE1: the empty room; IMAGE2: the room with furniture and home appliances; IMAGE3: IMAGE2 plus the human). By analyzing these three images, the real-time human tracker determines which piece of furniture (or home appliance) the human is near, and it predicts human motion using a support vector machine (SVM). In a performance experiment, locating the human with the three images took an average of 0.037 seconds. The SVM feature for motion recognition is the number of foreground pixels in each array line (row) of the moving object. We evaluated each motion 1000 times; the average accuracy over all motions was 86.5%.
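The approach described in the abstract can be sketched as follows: differencing IMAGE3 (room plus human) against IMAGE2 (room with furniture only) isolates the human, and the per-row foreground pixel counts of that mask serve as the SVM feature vector for motion classification. This is a minimal, hypothetical reconstruction, not the paper's implementation; the image sizes, threshold value, synthetic "motions", and use of scikit-learn's `SVC` are all assumptions for illustration.

```python
import numpy as np
from sklearn.svm import SVC

def human_mask(img2, img3, thresh=30):
    # IMAGE2: room with furniture/appliances; IMAGE3: IMAGE2 plus the human.
    # Absolute-differencing the two isolates the moving object (the human).
    # The threshold of 30 is an assumed value, not from the paper.
    diff = np.abs(img3.astype(int) - img2.astype(int))
    return diff > thresh

def row_pixel_counts(mask):
    # SVM feature: number of foreground pixels per array line (image row).
    return mask.sum(axis=1)

# Toy training data: two synthetic "motions" whose blobs sit at different
# heights, giving distinct row profiles (illustrative only).
rng = np.random.default_rng(0)
X, y = [], []
for label, row in [(0, 5), (1, 20)]:
    for _ in range(20):
        img2 = rng.integers(0, 50, (32, 32)).astype(np.uint8)
        img3 = img2.copy()
        img3[row:row + 8, 10:20] = 255  # synthetic "human" blob
        X.append(row_pixel_counts(human_mask(img2, img3)))
        y.append(label)

clf = SVC(kernel="rbf").fit(np.array(X), y)
```

In this sketch, IMAGE1 (the empty room) would additionally be compared against IMAGE2 to locate furniture and appliance regions, so that the human's mask can be matched to the nearest one; that step is omitted here for brevity.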