Neural Capture and Synthesis Conference Paper 2020

Adversarial Texture Optimization from RGB-D Scans

Justus Thies
Max Planck Research Group Leader, Neural Capture and Synthesis, Perceiving Systems

Realistic color texture generation is an important step in RGB-D surface reconstruction, but remains challenging in practice due to inaccuracies in reconstructed geometry, misaligned camera poses, and view-dependent imaging artifacts. In this work, we present a novel approach for color texture generation using a conditional adversarial loss obtained from weakly-supervised views. Specifically, we propose an approach to produce photorealistic textures for approximate surfaces, even from misaligned images, by learning an objective function that is robust to these errors. The key idea of our approach is to learn a patch-based conditional discriminator which guides the texture optimization to be tolerant to misalignments. Our discriminator takes a synthesized view and a real image, and evaluates whether the synthesized one is realistic, under a broadened definition of realism. We train the discriminator by providing as ‘real’ examples pairs of input views and their misaligned versions – so that the learned adversarial loss will tolerate errors from the scans. Experiments on synthetic and real data, evaluated both quantitatively and qualitatively, demonstrate the advantage of our approach over the state of the art.
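
To make the optimization described above concrete, the following is a minimal PyTorch-style sketch, not the authors' released implementation: the texture atlas is a learnable image, each captured view is re-synthesized by sampling the texture with a per-view UV map, and a patch-based conditional discriminator is trained with (misaligned view, view) pairs as ‘real’ examples so that the adversarial loss it provides tolerates small misalignments. The inputs views (a list of 1×3×H×W photos) and uv_maps (per-view 1×H×W×2 lookup grids in [-1, 1]), as well as the helpers PatchDiscriminator, render_view, and jitter, are illustrative assumptions.

# Minimal sketch of adversarial texture optimization with a patch-based
# conditional discriminator (illustrative only; shapes and helpers assumed).
import torch
import torch.nn as nn
import torch.nn.functional as F

class PatchDiscriminator(nn.Module):
    """Small conditional PatchGAN: scores local patches of a (synthesized, real) pair."""
    def __init__(self, in_ch=6, feat=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, feat, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(feat, feat * 2, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(feat * 2, 1, 4, stride=1, padding=1),  # per-patch realism logits
        )

    def forward(self, synthesized, conditioning):
        return self.net(torch.cat([synthesized, conditioning], dim=1))

def render_view(texture, uv):
    """Differentiable 'rendering': sample the texture atlas with a per-view UV map."""
    return F.grid_sample(texture, uv, mode="bilinear", align_corners=True)

def jitter(image, max_shift=4):
    """Make a misaligned copy of a view (a crude stand-in for pose/geometry error)."""
    dy, dx = torch.randint(-max_shift, max_shift + 1, (2,))
    return torch.roll(image, shifts=(int(dy), int(dx)), dims=(2, 3))

def optimize_texture(views, uv_maps, tex_res=512, steps=2000, device="cpu"):
    texture = torch.rand(1, 3, tex_res, tex_res, device=device, requires_grad=True)
    disc = PatchDiscriminator().to(device)
    opt_tex = torch.optim.Adam([texture], lr=1e-3)
    opt_disc = torch.optim.Adam(disc.parameters(), lr=1e-4)

    for step in range(steps):
        i = step % len(views)
        view, uv = views[i].to(device), uv_maps[i].to(device)
        synth = render_view(texture, uv)

        # Discriminator step: 'real' pairs are (misaligned view, view), so the learned
        # loss tolerates small misalignments; 'fake' pairs condition on the same view.
        real_logits = disc(jitter(view), view)
        fake_logits = disc(synth.detach(), view)
        d_loss = (F.binary_cross_entropy_with_logits(real_logits, torch.ones_like(real_logits))
                  + F.binary_cross_entropy_with_logits(fake_logits, torch.zeros_like(fake_logits)))
        opt_disc.zero_grad(); d_loss.backward(); opt_disc.step()

        # Texture step: update the texture so its re-rendered view fools the discriminator.
        g_loss = F.binary_cross_entropy_with_logits(disc(synth, view),
                                                    torch.ones_like(fake_logits))
        opt_tex.zero_grad(); g_loss.backward(); opt_tex.step()

    return texture.detach().clamp(0, 1)

The two updates alternate, as in standard GAN training: the discriminator keeps learning what the current texture gets wrong (beyond tolerable misalignment), and the texture is optimized against that learned, misalignment-tolerant loss rather than a fixed per-pixel error.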

Author(s): Huang, Jingwei and Thies, Justus and Dai, Angela and Kundu, Abhijit and Jiang, Chiyu and Guibas, Leonidas and Nießner, Matthias and Funkhouser, Thomas
Book Title: Proceedings IEEE Conf. on Computer Vision and Pattern Recognition (CVPR)
Year: 2020
Bibtex Type: Conference Paper (inproceedings)
Event Name: IEEE International Conference on Computer Vision and Pattern Recognition (CVPR) 2020
Event Place: Seattle, USA
URL: https://justusthies.github.io/posts/advtex/

BibTeX

@inproceedings{huang2020advtex,
  title = {Adversarial Texture Optimization from RGB-D Scans},
  booktitle = {Proceedings IEEE Conf. on Computer Vision and Pattern Recognition (CVPR)},
  abstract = {Realistic color texture generation is an important step in RGB-D surface reconstruction, but remains challenging in practice due to inaccuracies in reconstructed geometry, misaligned camera poses, and view-dependent imaging artifacts. In this work, we present a novel approach for color texture generation using a conditional adversarial loss obtained from weakly-supervised views. Specifically, we propose an approach to produce photorealistic textures for approximate surfaces, even from misaligned images, by learning an objective function that is robust to these errors. The key idea of our approach is to learn a patch-based conditional discriminator which guides the texture optimization to be tolerant to misalignments. Our discriminator takes a synthesized view and a real image, and evaluates whether the synthesized one is realistic, under a broadened definition of realism. We train the discriminator by providing as ‘real’ examples pairs of input views and their misaligned versions – so that the learned adversarial loss will tolerate errors from the scans. Experiments on synthetic and real data under quantitative or qualitative evaluation demonstrate the advantage of our approach in comparison to state of the art.},
  year = {2020},
  slug = {huang2020advtex},
  author = {Huang, Jingwei and Thies, Justus and Dai, Angela and Kundu, Abhijit and Jiang, Chiyu and Guibas, Leonidas and Nie{\ss}ner, Matthias and Funkhouser, Thomas},
  url = {https://justusthies.github.io/posts/advtex/}
}