PBR-GAN: Imitating Physically-Based Rendering With Generative Adversarial Networks

Cited by: 0
Authors
Li, Ru [1 ]
Dai, Peng [2 ]
Liu, Guanghui [3 ]
Zhang, Shengping [1 ]
Zeng, Bing [3 ]
Liu, Shuaicheng [3 ]
Affiliations
[1] Harbin Inst Technol, Sch Comp Sci & Technol, Weihai 264209, Peoples R China
[2] Univ Hong Kong, Dept Elect & Elect Engn, Hong Kong, Peoples R China
[3] Univ Elect Sci & Technol China, Sch Informat & Commun Engn, Chengdu 611731, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Rendering (computer graphics); Lighting; Decoding; Task analysis; Generative adversarial networks; Reflectivity; Color; Physically based rendering; generative adversarial network; illumination variation;
DOI
10.1109/TCSVT.2023.3298929
CLC Classification
TM [Electrical Technology]; TN [Electronic and Communication Technology];
Discipline Codes
0808; 0809;
Abstract
We propose a Generative Adversarial Network (GAN)-based architecture for achieving high-quality physically based rendering (PBR). Conventional PBR relies heavily on ray tracing, which is computationally expensive in complicated environments. Some recent deep learning-based methods improve efficiency but do not handle illumination variation well. In this paper, we propose PBR-GAN, an end-to-end GAN-based network that solves these problems while generating natural, photo-realistic images. Two encoders (the shading encoder and the albedo encoder) and two decoders (the image decoder and the light decoder) are introduced to achieve this goal. The two encoders and the image decoder constitute the generator, which learns the mapping between the generated domain and the real domain. The light decoder produces light maps that pay more attention to highlight and shadow regions. The discriminator optimizes the generator by distinguishing target images from generated ones. Three novel loss terms, concentrating on domain translation, overall shading preservation, and light map estimation, are proposed to optimize the photo-realistic outputs. Furthermore, a real dataset is collected to provide realistic information for training the GAN architecture. Extensive experiments indicate that PBR-GAN preserves illumination variation and improves perceptual image quality.
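The two-encoder/two-decoder topology described above can be sketched in a few lines. The sketch below is a toy NumPy stand-in, not the paper's implementation: the feature width, layer counts, and the use of flattened MLPs in place of convolutional networks are all assumptions made purely to illustrate how the shading and albedo features are fused and then shared by the image decoder and the light decoder.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(dims):
    """Random-weight MLP used as a stand-in for a real conv encoder/decoder."""
    return [(rng.standard_normal((i, o)) * 0.1, np.zeros(o))
            for i, o in zip(dims[:-1], dims[1:])]

def forward(layers, x):
    for w, b in layers:
        x = np.maximum(x @ w + b, 0.0)  # ReLU activation
    return x

H = W = 8          # toy resolution; real renders are far larger
D = H * W * 3      # flattened RGB input
F = 32             # feature width (assumed, not from the paper)

# Two encoders: one for shading cues, one for albedo (reflectance).
shading_enc = mlp([D, F])
albedo_enc = mlp([D, F])
# Two decoders share the concatenated features: the image decoder emits the
# rendered RGB image; the light decoder emits a single-channel light map
# that emphasizes highlight and shadow regions.
image_dec = mlp([2 * F, D])
light_dec = mlp([2 * F, H * W])

x = rng.random(D)  # a fake flattened input image
feats = np.concatenate([forward(shading_enc, x), forward(albedo_enc, x)])
rendered = forward(image_dec, feats)    # H*W*3 values -> photo-realistic image
light_map = forward(light_dec, feats)   # H*W values -> per-pixel light map
```

In training, the encoders plus the image decoder would form the generator optimized against a discriminator, with the three loss terms (domain translation, shading preservation, light map estimation) added to the adversarial objective.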
Pages: 1827-1840
Page count: 14