Towards space-time light field rendering
Proceedings of the 2005 symposium on Interactive 3D graphics and games
In this paper, we propose a novel framework called space-time light field rendering, which allows continuous exploration of a dynamic scene in both space and time. Compared to existing light field capture and rendering systems, it accepts unsynchronized video inputs and adds the freedom to control the visualization in the temporal domain, enabling effects such as smooth slow motion and temporal integration. To synthesize novel views from any viewpoint at any time instant, we develop a two-stage rendering algorithm. We first interpolate in the temporal domain to generate globally synchronized images, using a robust spatial-temporal image registration algorithm followed by edge-preserving image morphing. We then interpolate these software-synchronized images in the spatial domain to synthesize the final view. In addition, we introduce an accurate and robust algorithm to estimate subframe temporal offsets among the input video sequences. Experimental results on unsynchronized videos, with and without time stamps, show that our approach maintains photorealistic quality across a variety of real scenes.
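The two-stage pipeline described above can be sketched in simplified form. This is not the paper's implementation: the temporal stage below uses plain linear blending in place of the authors' registration-plus-morphing step, the spatial stage uses fixed blending weights in place of light-field view weighting, and all function names, the per-camera `offsets` parameter, and the array layout are illustrative assumptions.

```python
import numpy as np

def temporal_interpolate(frame_a, frame_b, alpha):
    """Hypothetical stand-in for the registration + edge-preserving
    morphing step: linearly blend two consecutive frames of one camera
    to approximate its appearance at a subframe time instant."""
    return (1.0 - alpha) * frame_a + alpha * frame_b

def synchronize_cameras(videos, offsets, t):
    """Stage 1: resample every unsynchronized video at global time t.

    videos  -- list of (num_frames, H, W) arrays, one per camera
    offsets -- per-camera subframe temporal offsets, in frame units
               (as would come from the paper's offset-estimation step)
    """
    synced = []
    for video, offset in zip(videos, offsets):
        local_t = t - offset               # camera-local (fractional) frame time
        i = int(np.floor(local_t))
        alpha = local_t - i                # subframe blending weight in [0, 1)
        synced.append(temporal_interpolate(video[i], video[i + 1], alpha))
    return synced

def spatial_interpolate(synced, weights):
    """Stage 2: blend the software-synchronized images into the novel
    view; a real system would derive per-camera weights from the target
    viewpoint, as in unstructured lumigraph rendering."""
    w = np.asarray(weights, dtype=float)
    w /= w.sum()                           # normalize blending weights
    return np.tensordot(w, np.stack(synced), axes=1)
```

For example, with two cameras whose offsets are 0.0 and 0.25 frames, querying global time `t = 1.5` blends frames 1 and 2 of each camera with subframe weights 0.5 and 0.25 respectively, and the spatial stage then merges the two synchronized images into the final view.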