With the introduction of battery-powered wireless embedded smart cameras, it has become viable to deploy large numbers of spatially distributed cameras with greater flexibility in camera placement. However, many challenges remain in building operational, battery-powered, wireless smart-camera networks. Battery life is limited, and video processing tasks such as foreground detection and tracking consume a considerable amount of energy. It is therefore essential to design and implement lightweight algorithms and methods that increase the energy efficiency of each camera node and, in turn, the overall lifetime of the camera network. We present an adaptive, tracking-based method that significantly decreases the energy consumption of the embedded camera. The microprocessor on the camera board is put into an idle state depending on the amount of activity in the scene, and the length of the idle period is adapted to the speeds of the tracked objects. Instead of continuously capturing and processing every frame, the camera drops frames during idle mode while preserving tracking performance, and thus system reliability. We present experimental results showing the energy efficiency of the proposed method and the resulting gain in battery life. The proposed methodology provides 25% to 37% savings in energy consumption and a 45.83% to 65% increase in battery life, depending on the number of objects in the scene and their speeds.
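As a rough illustration of the speed-adaptive idle period described in the abstract, the short Python sketch below maps the speed of the fastest tracked object to the number of frames the processor sleeps: fast objects shorten the idle period so the tracker does not lose them, while slow objects or an empty scene allow the longest sleep. The function name, frame width, and thresholds are illustrative assumptions, not the authors' actual implementation.

# Minimal sketch of a speed-adaptive idle period (assumed names/constants).

def idle_frames(max_speed_px_per_frame, frame_width_px=320,
                min_idle=1, max_idle=15):
    """Return how many frames to skip while the processor idles.

    Fast objects -> short idle period (to avoid losing the track);
    slow objects or an empty scene -> long idle period (to save energy).
    """
    if max_speed_px_per_frame <= 0:        # no motion observed in the scene
        return max_idle
    # Frames the fastest object needs to traverse ~10% of the image width;
    # idling for fewer frames keeps its displacement small between updates.
    frames_to_cross = 0.10 * frame_width_px / max_speed_px_per_frame
    return int(min(max_idle, max(min_idle, frames_to_cross)))


if __name__ == "__main__":
    for speed in (0, 2, 8, 25):            # object speed in pixels per frame
        print(f"speed={speed:>2} px/frame -> idle for {idle_frames(speed)} frames")

In an actual camera node, the frames captured during the idle period would simply be dropped rather than processed, which is where the reported energy savings come from.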