SurfaceFusion: unobtrusive tracking of everyday objects in tangible user interfaces
GI '08: Proceedings of Graphics Interface 2008
Computer vision-based articulated human motion tracking is attractive for many applications since it allows unobtrusive and passive estimation of people's activities. Although much progress has been made on human-only tracking, visually tracking people who interact with objects such as tools, products, packages, and devices is considerably more challenging. The wide variety of objects, their varying visual appearance, and their varying (and often small) size make a vision-based understanding of person-object interactions very difficult. To alleviate this problem for at least some application domains, we propose a framework that combines visual human motion tracking with RFID-based object tracking. We customized commonly available RFID technology to obtain orientation estimates of objects in the field of RFID emitter coils. The resulting fusion of visual human motion tracking and RFID-based object tracking enables accurate estimation of high-level interactions between people and objects in application domains such as retail, home care, workplace safety, and manufacturing, among others.
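To make the fusion idea concrete, here is a minimal sketch of one way the two sensor streams could be combined. It assumes hypothetical per-frame hand positions from the vision tracker and per-frame object orientation estimates from the RFID reader; the function name, data layout, and thresholds are illustrative assumptions, not the paper's actual method.

```python
import math

def detect_interaction(hand_positions, tag_orientations,
                       dist_threshold=0.15, angle_threshold=10.0):
    """Flag frames where a tracked hand is near the tagged object while
    the RFID-estimated orientation changes, suggesting manipulation.

    hand_positions:   list of (x, y, z) hand coordinates in metres,
                      one entry per frame (from the vision tracker)
    tag_orientations: list of (angle_deg, (x, y, z)) per frame --
                      orientation estimate and object position from RFID
    Returns the list of frame indices where an interaction is detected.
    """
    events = []
    for t in range(1, len(hand_positions)):
        hand = hand_positions[t]
        angle, obj_pos = tag_orientations[t]
        prev_angle = tag_orientations[t - 1][0]
        # Hand must be close to the object...
        near = math.dist(hand, obj_pos) < dist_threshold
        # ...while the tag's orientation changes noticeably.
        rotated = abs(angle - prev_angle) > angle_threshold
        if near and rotated:
            events.append(t)
    return events

# Synthetic example: the hand approaches a stationary object at the
# origin, and the object rotates by ~23 degrees in the final frame.
hands = [(0.5, 0.0, 0.0), (0.2, 0.0, 0.0), (0.02, 0.0, 0.0)]
tags = [(0.0, (0.0, 0.0, 0.0)), (2.0, (0.0, 0.0, 0.0)),
        (25.0, (0.0, 0.0, 0.0))]
print(detect_interaction(hands, tags))  # -> [2]
```

A real system would replace the fixed thresholds with a probabilistic model over both streams, but the sketch shows the core logic: neither cue alone (proximity or rotation) is treated as an interaction; only their co-occurrence is.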