Sensor networks for media production
SenSys '04 Proceedings of the 2nd international conference on Embedded networked sensor systems
With the advent of tiny networked devices, Mark Weiser's vision of a world embedded with invisible computers is coming of age. Due to their small size and relative ease of deployment, sensor networks have been used by zoologists, seismologists, and military personnel. In this paper, we investigate a novel application of sensor networks to the film industry. In particular, we are interested in augmenting film and video footage with sensor data. Unobtrusive sensors are deployed on a film set or in a television studio and on performers. During the filming of a scene, sensor data such as light intensity, color temperature, and location are collected and synchronized with each film or video frame. Later, editors, graphics artists, and programmers can view this data in synchronization with film and video playback. For example, such data can help define a new level of seamless integration between computer graphics and real-world photography. A real-time version of our system would allow sensor data to trigger camera movement and cue special effects. In this paper, we discuss the design and implementation of the first part of our embedded film set environment: the augmented recording system. Augmented recording is a foundational component of the UCLA Hypermedia Studio's research into the use of sensor networks in film and video production. In addition, we have evaluated our system in a television studio.
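The core operation the abstract describes is aligning asynchronous sensor readings with individual film or video frames. The paper does not give an algorithm, but a minimal sketch of one plausible approach is to timestamp both streams and attach to each frame the most recent sensor sample at or before its capture time. The function and variable names below (`annotate_frames`, `frame_times`, `samples`) are illustrative, not from the paper:

```python
from bisect import bisect_right

def annotate_frames(frame_times, samples):
    """Attach to each frame timestamp the most recent sensor sample.

    frame_times: sorted frame capture times in seconds
    samples: sorted (time, reading) tuples from one sensor,
             e.g. light-intensity readings in lux
    """
    sample_times = [t for t, _ in samples]
    annotated = []
    for ft in frame_times:
        # Index of the last sample taken at or before this frame.
        i = bisect_right(sample_times, ft) - 1
        reading = samples[i][1] if i >= 0 else None  # no sample yet
        annotated.append((ft, reading))
    return annotated

# Hypothetical example: 24 fps frames, light sensor sampled at ~10 Hz.
frames = [n / 24.0 for n in range(5)]
lux = [(0.00, 410), (0.10, 415), (0.20, 300)]
print(annotate_frames(frames, lux))
```

In practice, a lower sensor sampling rate than the frame rate (as here) means consecutive frames share a reading; a real system would also need clock synchronization between the camera and the sensor nodes, which this sketch assumes away.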