Learning to Importance Sample in Primary Sample Space

Cited by: 35
Authors
Zheng, Quan [1 ]
Zwicker, Matthias [1 ]
Affiliations
[1] University of Maryland, College Park, MD 20742, USA
Funding
Swiss National Science Foundation
Keywords
Global illumination; Importance sampling; Neural networks; CCS Concepts: • Computing methodologies → Ray tracing
DOI
10.1111/cgf.13628
Chinese Library Classification (CLC)
TP31 [Computer Software]
Discipline codes
081202; 0835
Abstract
Importance sampling is one of the most widely used variance reduction strategies in Monte Carlo rendering. We propose a novel importance sampling technique that uses a neural network to learn how to sample from a desired density represented by a set of samples. Our approach considers an existing Monte Carlo rendering algorithm as a black box. During a scene-dependent training phase, we learn to generate samples with a desired density in the primary sample space of the renderer using maximum likelihood estimation. We leverage a recent neural network architecture that was designed to represent real-valued non-volume preserving (Real NVP) transformations in high dimensional spaces. We use Real NVP to non-linearly warp primary sample space and obtain desired densities. In addition, Real NVP efficiently computes the determinant of the Jacobian of the warp, which is required to implement the change of integration variables implied by the warp. A main advantage of our approach is that it is agnostic of underlying light transport effects, and can be combined with an existing rendering technique by treating it as a black box. We show that our approach leads to effective variance reduction in several practical scenarios.
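The abstract's core recipe can be illustrated with a short sketch: a Real-NVP-style affine coupling layer warps primary sample space, and the log-determinant of its Jacobian supplies the change of integration variables. The Python sketch below is illustrative only and is not the authors' code; the names (scale_shift_net, affine_coupling, warped_pdf) are hypothetical placeholders, the "network" is a fixed random affine map rather than a trained one, and the maximum-likelihood training loop, the stacking of several coupling layers, and the constraint that warped samples stay inside the unit hypercube of primary sample space are all omitted.

import numpy as np

rng = np.random.default_rng(0)
D = 4                                    # dimension of primary sample space
MASK = np.array([1.0, 1.0, 0.0, 0.0])    # dims this coupling layer leaves unchanged

# Toy stand-in for the scale/shift network of a coupling layer.
W_S, B_S = 0.1 * rng.normal(size=(D, D)), np.zeros(D)
W_T, B_T = 0.1 * rng.normal(size=(D, D)), np.zeros(D)

def scale_shift_net(u_fixed):
    # Predict per-dimension log-scale s and shift t from the untouched dims.
    s = np.tanh(u_fixed @ W_S + B_S)     # bounded log-scale for stability
    t = u_fixed @ W_T + B_T
    return s, t

def affine_coupling(u):
    # Warp u -> x: masked dims pass through unchanged, the remaining dims get
    # an affine map whose parameters depend only on the masked dims. The
    # Jacobian is triangular, so log|det J| is the sum of the active log-scales.
    u_fixed = u * MASK
    s, t = scale_shift_net(u_fixed)
    x = u_fixed + (1.0 - MASK) * (u * np.exp(s) + t)
    log_det = np.sum((1.0 - MASK) * s, axis=-1)
    return x, log_det

def warped_pdf(log_det):
    # Change of variables: with u uniform on [0,1)^D (density 1), the warped
    # sample x = T(u) has density p(x) = 1 / |det J_T(u)|.
    return np.exp(-log_det)

# Usage: draw uniform primary samples, warp them, and form a Monte Carlo
# estimate of a placeholder integrand over the image of the warp, using the
# warped density as the importance distribution.
u = rng.random((1024, D))
x, log_det = affine_coupling(u)
f = np.exp(-np.sum(x ** 2, axis=-1))     # placeholder integrand
estimate = np.mean(f / warped_pdf(log_det))
print("importance-sampled estimate:", estimate)

The property the sketch relies on is the triangular Jacobian of the coupling layer: its log-determinant reduces to a sum of predicted log-scales, which is what makes the change of variables cheap enough to evaluate for every sample during rendering.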
Pages: 169 - 179
Page count: 11
Related papers
50 records in total
  • [31] Image Classification Learning Method Incorporating Zero-Sample Learning and Small-Sample Learning
    Sun, Fanglei
    Diao, Zhifeng
    MATHEMATICAL PROBLEMS IN ENGINEERING, 2022, 2022
  • [32] The Sample Complexity of Dictionary Learning
    Vainsencher, Daniel
    Mannor, Shie
    Bruckstein, Alfred M.
    JOURNAL OF MACHINE LEARNING RESEARCH, 2011, 12 : 3259 - 3281
  • [33] Learning to Learn and Sample BRDFs
    Liu, Chen
    Fischer, Michael
    Ritschel, Tobias
    COMPUTER GRAPHICS FORUM, 2023, 42 (02) : 201 - 211
  • [34] ON THE SAMPLE COMPLEXITY OF WEAK LEARNING
    GOLDMAN, SA
    KEARNS, MJ
    SCHAPIRE, RE
    INFORMATION AND COMPUTATION, 1995, 117 (02) : 276 - 287
  • [35] LEARNING TO SAMPLE FOR SPARSE SIGNALS
    Mulleti, Satish
    Zhang, Haiyang
    Eldar, Yonina C.
    2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022: 3363 - 3367
  • [36] Incremental learning with sample queries
    Ratsaby, J
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 1998, 20 (08) : 883 - 888
  • [37] Dense Sample Deep Learning
    Hanson, Stephen Jose
    Yadav, Vivek
    Hanson, Catherine
    NEURAL COMPUTATION, 2024, 36 (06) : 1228 - 1244
  • [38] Improving Virtual Sample Generation for Small Sample Learning with Dependent Attributes
    Lin, Liang-Sian
    Li, Der-Chiang
    Pan, Chih-Wei
    PROCEEDINGS 2016 5TH IIAI INTERNATIONAL CONGRESS ON ADVANCED APPLIED INFORMATICS IIAI-AAI 2016, 2016: 715 - 718
  • [39] The cognitive demands of understanding the sample space
    Nunes T.
    Bryant P.
    Evans D.
    Gottardis L.
    Terlektsi M.-E.
    ZDM, 2014, 46 (3): 437 - 448
  • [40] CONVERGENCE OF EXPERIMENTS ON A FINITE SAMPLE SPACE
    TORGERSEN, EN
    ANNALS OF MATHEMATICAL STATISTICS, 1969, 40 (03): 1155 - &