Optical flow can be reliably estimated between areas visible in two images, but not in occlusion areas. If optical flow is needed over the whole image domain, one approach is to use additional views of the same scene. If such views are unavailable, a common alternative is to extrapolate optical flow into occlusion areas. Since the location of such areas is usually unknown before the flow is computed, estimation typically proceeds in three steps: first, occlusion-ignorant optical flow is estimated; then, occlusion areas are identified using this (unreliable) flow; finally, the optical flow is corrected using the detected occlusion areas. This approach, however, does not permit interaction between the optical flow and occlusion estimates. In this paper, we enable such interaction by proposing a variational formulation that jointly computes optical flow, implicitly detects occlusions, and extrapolates optical flow into occlusion areas. The extrapolation mechanism is based on anisotropic diffusion and uses the underlying image gradient to preserve structure, such as optical flow discontinuities. Our results show significant improvements in the computed optical flow fields over other approaches, both qualitatively and quantitatively.
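The extrapolation idea described above can be illustrated with a minimal sketch: anisotropic diffusion fills flow values inside an occlusion mask, with edge-stopping weights derived from the image gradient so that flow discontinuities aligned with image edges are preserved. This is an illustrative simplification, not the paper's joint variational formulation: it treats a single scalar flow component, uses a Perona-Malik-style diffusivity, and the function name and parameters (`kappa`, `iters`) are assumptions chosen for the example.

```python
import numpy as np

def extrapolate_flow(flow, occluded, image, iters=300, kappa=0.1):
    """Fill `flow` values inside the boolean mask `occluded` by
    anisotropic diffusion (illustrative sketch, not the paper's method).

    Diffusivity is low across strong image gradients, so the diffusion
    avoids averaging across image edges; pixels outside the mask are
    held fixed as boundary data. Borders wrap (np.roll) for brevity.
    """
    u = flow.astype(float).copy()
    # Edge-stopping weights from image gradients (Perona-Malik style).
    gy, gx = np.gradient(image.astype(float))
    g = 1.0 / (1.0 + (gx**2 + gy**2) / kappa**2)
    for _ in range(iters):
        # Weighted 4-neighbour averaging, applied only at occluded pixels.
        num = np.zeros_like(u)
        den = np.zeros_like(u)
        for shift, axis in ((1, 0), (-1, 0), (1, 1), (-1, 1)):
            w = np.roll(g, shift, axis=axis)
            num += w * np.roll(u, shift, axis=axis)
            den += w
        filled = num / np.maximum(den, 1e-12)
        u[occluded] = filled[occluded]
    return u
```

In the paper's joint formulation, the occlusion mask is not a fixed input as it is here; occlusion detection and flow estimation inform each other within a single energy minimization.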