Generative Appearance Flow: A Hybrid Approach for Outdoor View Synthesis

Muhammad Usman Rafique, Hunter Blanton, Noah Snavely, Nathan Jacobs

Research output: Contribution to conference › Paper › peer-review

2 Scopus citations

Abstract

We address the problem of view synthesis in complex outdoor scenes. We propose a novel convolutional neural network architecture that includes flow-based and direct synthesis sub-networks. Both sub-networks introduce novel elements that greatly improve the quality of the synthesized images. These images are then adaptively fused to create the final output image. Our approach achieves state-of-the-art performance on the KITTI dataset, which is commonly used to evaluate view-synthesis methods. Unlike many recently proposed methods, ours is trained without the need for additional geometric constraints, such as a ground-truth depth map, making it more broadly applicable. Our approach also achieved the best performance on the Brooklyn Panorama Synthesis dataset, which we introduce as a new, challenging benchmark for view synthesis. Our dataset, code, and pretrained models are available at https://mvrl.github.io/GAF.
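The abstract describes fusing the outputs of the flow-based and direct-synthesis sub-networks with an adaptive, per-pixel combination. As a minimal illustrative sketch (not the paper's exact formulation), the snippet below assumes a sigmoid-gated convex combination of the two candidate images; the function name, the weight parameterization, and the use of NumPy are all assumptions for illustration.

```python
import numpy as np

def fuse_predictions(img_flow, img_direct, weight_logits):
    """Adaptively fuse two synthesized images with a per-pixel weight map.

    img_flow, img_direct: (H, W, C) candidate images from the two sub-networks.
    weight_logits: (H, W) raw scores from a (hypothetical) fusion head.
    The sigmoid-weighted convex combination is an assumption, not the
    paper's published architecture.
    """
    w = 1.0 / (1.0 + np.exp(-weight_logits))      # per-pixel weight in (0, 1)
    w = w[..., None]                              # broadcast over channels
    return w * img_flow + (1.0 - w) * img_direct

# Toy usage: 4x4 RGB images; zero logits give w = 0.5, i.e. a plain average.
rng = np.random.default_rng(0)
a = rng.random((4, 4, 3))
b = rng.random((4, 4, 3))
fused = fuse_predictions(a, b, np.zeros((4, 4)))
```

With zero logits the gate is 0.5 everywhere, so the fused image reduces to the pixelwise mean of the two candidates; a trained fusion head would instead favor whichever sub-network is more reliable at each pixel.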

Original language: English
State: Published - 2020
Event: 31st British Machine Vision Conference, BMVC 2020 - Virtual, Online
Duration: Sep 7, 2020 – Sep 10, 2020

Conference

Conference: 31st British Machine Vision Conference, BMVC 2020
City: Virtual, Online
Period: 9/7/20 – 9/10/20

