Active Exploration for Neural Global Illumination of Variable Scenes

Cited by: 14
Authors
Diolatzis, Stavros [1 ,2 ]
Philip, Julien [1 ,2 ,3 ]
Drettakis, George [1 ,2 ]
Affiliations
[1] INRIA, F-06902 Sophia Antipolis, France
[2] Univ Cote Azur, GraphDeco, F-06902 Sophia Antipolis, France
[3] Adobe Res, London, England
Source
ACM TRANSACTIONS ON GRAPHICS | 2022, Vol. 41, No. 5
Funding
National Natural Science Foundation of China
Keywords
Computer graphics; global illumination; neural rendering; neural networks; deep learning; state-of-the-art
DOI
10.1145/3522735
Chinese Library Classification
TP31 [Computer software]
Discipline codes
081202; 0835
Abstract
Neural rendering algorithms introduce a fundamentally new approach for photorealistic rendering, typically by learning a neural representation of illumination from large numbers of ground-truth images. When training for a given variable scene, such as changing objects, materials, lights, and viewpoint, the space D of possible training data instances quickly becomes unmanageable as the dimensions of variable parameters increase. We introduce a novel Active Exploration method using Markov Chain Monte Carlo, which explores D, generating samples (i.e., ground-truth renderings) that best help training, and interleaves training with on-the-fly sample data generation. We introduce a self-tuning sample reuse strategy to minimize the expensive step of rendering training samples. We apply our approach to a neural generator that learns to render novel scene instances given an explicit parameterization of the scene configuration. Our results show that Active Exploration trains our network much more efficiently than uniform sampling and, together with our resolution enhancement approach, achieves better quality than uniform sampling at convergence. Our method allows interactive rendering of hard light transport paths (e.g., complex caustics), which require very high sample counts to be captured, and provides dynamic scene navigation and manipulation after training for 5 to 18 hours, depending on the required quality and scene variations.
Pages: 18
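The abstract describes the core idea only at a high level. Below is a minimal sketch, not the authors' implementation: a Metropolis-Hastings chain over the scene-parameter space D whose acceptance rule favours configurations where the generator's current loss is high, interleaved with training on the freshly rendered sample. The functions render_ground_truth, loss_estimate, and train_step, and the linear "generator", are placeholder assumptions standing in for the path tracer and the neural network used in the paper.

import numpy as np

rng = np.random.default_rng(0)

def render_ground_truth(x):
    # Placeholder for the expensive ground-truth rendering of scene config x.
    return np.sin(10 * x).sum()          # dummy "image" statistic

def loss_estimate(x, weights):
    # Placeholder per-sample loss of the generator at config x.
    return float(np.abs(render_ground_truth(x) - weights @ x))

def train_step(x, target, weights, lr=1e-2):
    # Placeholder gradient step of the generator on one (config, render) pair.
    pred = weights @ x
    grad = 2 * (pred - target) * x
    return weights - lr * grad

dim = 4                                   # number of variable scene parameters
weights = np.zeros(dim)                   # stand-in for network parameters
x = rng.uniform(0.0, 1.0, dim)            # current scene configuration in D
f_x = loss_estimate(x, weights)

for it in range(2000):
    # Propose a nearby configuration (random-walk proposal, clipped to [0, 1]).
    x_new = np.clip(x + rng.normal(0.0, 0.05, dim), 0.0, 1.0)
    f_new = loss_estimate(x_new, weights)

    # Accept with probability given by the loss ratio, so the chain spends
    # more time in regions where the generator currently performs worst.
    if rng.uniform() < min(1.0, (f_new + 1e-8) / (f_x + 1e-8)):
        x, f_x = x_new, f_new

    # Interleave: render the accepted configuration and train on it.
    target = render_ground_truth(x)
    weights = train_step(x, target, weights)
    f_x = loss_estimate(x, weights)       # refresh the loss after the update

print("final configuration:", x, "loss:", f_x)

In the paper the rendering step is the dominant cost, which is why a self-tuning sample reuse strategy (not modeled in this sketch) is layered on top of the exploration loop.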