Camera Network Calibration and Synchronization from Silhouettes in Archived Video
International Journal of Computer Vision
We propose an automatic approach to synchronize a network of uncalibrated and unsynchronized video cameras and to recover the complete calibration of all of them. In this paper, we extend recent work on computing epipolar geometry from dynamic silhouettes to handle unsynchronized sequences and recover the temporal offset between them. This allows us to compute the fundamental matrices and temporal offsets for many view-pairs in the network. Knowing the time-shifts between enough view-pairs lets us robustly synchronize the whole network, and the calibration of all the cameras is then recovered from the fundamental matrices. The dynamic shape of the object can subsequently be reconstructed using a visual-hull algorithm. Our method is especially useful for multi-camera shape-from-silhouette systems, as visual hulls can now be reconstructed without the need for a specific calibration session.
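The step from pairwise time-shifts to a network-wide synchronization can be posed as a small linear least-squares problem: each measured offset d_ij constrains the per-camera offsets by t_j - t_i = d_ij, and fixing one reference camera removes the global time ambiguity. The sketch below is illustrative only; `synchronize_network` and its interface are assumptions, not the paper's implementation, and it presumes the pairwise offsets have already been estimated from the silhouette-based epipolar analysis.

```python
import numpy as np

def synchronize_network(num_cameras, pairwise_offsets, ref=0):
    """Recover one time offset per camera from noisy pairwise offsets.

    pairwise_offsets: iterable of (i, j, d_ij), where d_ij estimates
    t_j - t_i. The reference camera is pinned to offset 0; the rest are
    the least-squares solution of the overdetermined linear system.
    """
    rows, b = [], []
    for i, j, d in pairwise_offsets:
        r = np.zeros(num_cameras)
        r[i], r[j] = -1.0, 1.0   # encodes t_j - t_i = d
        rows.append(r)
        b.append(d)
    # Pin the reference camera's offset to zero (fixes the gauge).
    r = np.zeros(num_cameras)
    r[ref] = 1.0
    rows.append(r)
    b.append(0.0)
    A = np.vstack(rows)
    t, *_ = np.linalg.lstsq(A, np.asarray(b), rcond=None)
    return t

# Three cameras with slightly inconsistent pairwise measurements:
offsets = [(0, 1, 2.1), (1, 2, 3.0), (0, 2, 4.9)]
t = synchronize_network(3, offsets)
```

With redundant view-pairs, the least-squares averaging is what makes the network synchronization robust to noise in any single pairwise estimate; in practice one would also discard outlier pairs (e.g. via RANSAC) before solving.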