Projective Urban Texturing

Yiangos Georgiou, Melinos Averkiou, Tom Kelly, Evangelos Kalogerakis; in: 2021 International Conference on 3D Vision (3DV), pp. 1034–1043, IEEE 2021.

Abstract

This paper proposes a method for the automatic generation of textures for 3D city meshes in immersive urban environments. Many recent pipelines capture or synthesize large quantities of city geometry using scanners or procedural modeling. Such geometry is intricate and realistic; however, generating photo-realistic textures for such large scenes remains an open problem. We propose to generate textures for input target 3D meshes driven by the textural style present in readily available datasets of panoramic photos capturing urban environments. Re-targeting such 2D datasets to 3D geometry is challenging because the underlying shape, size, and layout of the urban structures in the photos do not correspond to those in the target meshes. The photos also often contain objects (e.g., trees, vehicles) that may not even be present in the target geometry. To address these issues, we present Projective Urban Texturing (PUT), a method that re-targets textural style from real-world panoramic images to unseen urban meshes. PUT relies on contrastive and adversarial training of a neural architecture designed for unpaired image-to-texture translation. The generated textures are stored in a texture atlas applied to the target 3D mesh geometry. To promote texture consistency, PUT employs an iterative procedure in which texture synthesis is conditioned on previously generated, adjacent textures. We present both quantitative and qualitative evaluations of the generated textures.
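
As a rough illustration of the iterative procedure described in the abstract, the following Python sketch shows how per-view synthesis could be conditioned on a partially filled texture atlas. It is a hypothetical sketch based only on the abstract: the helper names (render_panorama, project_to_atlas) and the translator interface are assumptions for illustration, not the authors' released code.

import torch
import torch.nn as nn

def render_panorama(mesh, atlas, view):
    # Stub: render a panoramic image of the mesh from this viewpoint,
    # sampling the current atlas so previously generated textures appear
    # in the conditioning image.
    return torch.zeros(1, 3, 256, 512)

def project_to_atlas(mesh, atlas, view, panorama):
    # Stub: back-project the synthesized panorama into the texture atlas
    # through the mesh's UV mapping.
    pass

def texture_mesh(mesh, translator: nn.Module, viewpoints, atlas):
    # Texture the mesh one panoramic view at a time; each synthesis step
    # is conditioned on previously generated, adjacent textures via the
    # shared atlas, which is what promotes consistency across views.
    translator.eval()
    for view in viewpoints:
        conditioning = render_panorama(mesh, atlas, view)
        with torch.no_grad():
            # Unpaired image-to-texture translation network, trained with
            # contrastive and adversarial losses per the abstract.
            panorama = translator(conditioning)
        project_to_atlas(mesh, atlas, view, panorama)
    return atlas

Conditioning each view on the shared atlas is what lets textures generated for adjacent views agree along their seams.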

BibTeX

@inproceedings{georgiou2021projective,
  title        = {Projective Urban Texturing},
  author       = {Yiangos Georgiou and Melinos Averkiou and Tom Kelly and Evangelos Kalogerakis},
  booktitle    = {2021 International Conference on 3D Vision (3DV)},
  pages        = {1034--1043},
  year         = {2021},
  doi          = {10.1109/3DV53792.2021.00111},
  organization = {IEEE},
}