3D Moments from Near-Duplicate Photos

Qianqian Wang, Zhengqi Li, David Salesin, Noah Snavely, Brian Curless, Janne Kontkanen

Google Research   Cornell Tech, Cornell University   University of Washington

CVPR 2022

3D Moments turns a near-duplicate photo pair into a space-time video with scene motion and camera parallax

Abstract

We introduce 3D Moments, a new computational photography effect. As input we take a pair of near-duplicate photos, i.e., photos of moving subjects from similar viewpoints, common in people's photo collections. As output, we produce a video that smoothly interpolates the scene motion from the first photo to the second, while also producing camera motion with parallax that gives a heightened sense of 3D. To achieve this effect, we represent the scene as a pair of feature-based layered depth images augmented with scene flow. This representation enables motion interpolation along with independent control of the camera viewpoint. Our system produces photo-realistic space-time videos with motion parallax and scene dynamics, while plausibly recovering regions occluded in the original views. We conduct extensive experiments demonstrating superior performance over baselines on public datasets and in-the-wild photos.
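To make the representation described above concrete, here is a minimal sketch (not the authors' implementation) of a feature-based layered-depth-image (LDI) layer augmented with scene flow, together with linear motion interpolation and reprojection into a novel viewpoint. All class names, shapes, and values below are illustrative assumptions; the full system additionally splats features from both photos' LDIs and decodes them with a network.

import numpy as np
from dataclasses import dataclass


@dataclass
class FeatureLDILayer:
    """One depth layer of a feature-based LDI, augmented with scene flow."""
    points: np.ndarray      # (N, 3) 3D positions of layer pixels, camera coords
    features: np.ndarray    # (N, C) per-pixel appearance features
    scene_flow: np.ndarray  # (N, 3) 3D motion toward the matching point at t=1


def interpolate_points(layer: FeatureLDILayer, t: float) -> np.ndarray:
    """Linearly advect layer points to an intermediate time t in [0, 1]."""
    return layer.points + t * layer.scene_flow


def project(points: np.ndarray, K: np.ndarray, pose: np.ndarray) -> np.ndarray:
    """Project 3D points into a novel view given intrinsics K and a 4x4
    world-to-camera pose; returns (N, 2) pixel coordinates."""
    homog = np.concatenate([points, np.ones((len(points), 1))], axis=1)
    cam = (pose @ homog.T).T[:, :3]
    uv = (K @ cam.T).T
    return uv[:, :2] / uv[:, 2:3]


# Usage sketch: render one layer at time t=0.5 from a slightly shifted camera,
# giving both interpolated scene motion and motion parallax.
layer = FeatureLDILayer(
    points=np.random.rand(1000, 3) + np.array([0.0, 0.0, 2.0]),
    features=np.random.rand(1000, 32),
    scene_flow=0.05 * np.random.randn(1000, 3),
)
K = np.array([[500.0, 0.0, 320.0], [0.0, 500.0, 240.0], [0.0, 0.0, 1.0]])
pose = np.eye(4)
pose[0, 3] = 0.02  # small horizontal camera translation (parallax)
uv = project(interpolate_points(layer, t=0.5), K, pose)

Because scene time (t) and camera pose are separate inputs, the same layers can be re-rendered along any camera path while the subject motion is interpolated independently, which is what enables the space-time effect.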

Video



More Results


Tracking camera motion (horizontal)

Tracking camera motion (vertical)

Zooming-in camera motion


Failure Cases


  
Our method cannot handle scene motion between the two photos that is too large or too complex, and it produces artifacts when the estimated monocular depth is inaccurate.

BibTeX


@inproceedings{wang2022_3dmoments,
  title     = {3D Moments from Near-Duplicate Photos},
  author    = {Wang, Qianqian and Li, Zhengqi and Salesin, David and Snavely, Noah
               and Curless, Brian and Kontkanen, Janne},
  booktitle = {CVPR},
  year      = {2022}
}