Learning Implicit Generative Models by Teaching Density Estimators

Cited by: 1
Authors
Xu, Kun [1 ]
Du, Chao [1 ]
Li, Chongxuan [1 ]
Zhu, Jun [1 ]
Zhang, Bo [1 ]
Institutions
[1] Tsinghua Univ, Dept Comp Sci & Technol, Tsinghua Bosch ML Ctr, BNRist Ctr, Inst AI, THBI Lab, Beijing, Peoples R China
Keywords
Deep generative models; Generative adversarial nets; Mode collapse problem;
DOI
10.1007/978-3-030-67661-2_15
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Discipline Classification Codes: 081104; 0812; 0835; 1405
Abstract
Implicit generative models are difficult to train because no explicit density function is defined. Generative adversarial nets (GANs) provide a minimax framework to train such models, which, however, can suffer from mode collapse due to the nature of the JS-divergence. This paper presents a learning-by-teaching (LBT) approach to learning implicit models, which intrinsically avoids the mode collapse problem by optimizing a KL-divergence rather than the JS-divergence used in GANs. In LBT, an auxiliary density estimator is introduced to fit the implicit model's distribution, while the implicit model teaches the density estimator to match the data distribution. LBT is formulated as a bilevel optimization problem whose optimal generator matches the true data distribution. LBT can be naturally integrated with GANs to derive a hybrid LBT-GAN that enjoys complementary benefits. Finally, we present a stochastic gradient ascent algorithm with unrolling to solve the challenging learning problems. Experimental results demonstrate the effectiveness of our method.
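The bilevel structure described in the abstract can be illustrated with a minimal toy sketch. Everything below is an assumption for illustration, not the paper's implementation: the implicit generator is a 1-D reparameterized Gaussian, the auxiliary density estimator is a Gaussian fit by closed-form MLE (standing in for the paper's unrolled estimator updates), and the outer gradient is taken by finite differences rather than unrolled backpropagation.

```python
import numpy as np

# Toy LBT sketch (illustrative assumptions, not the paper's code).
# Inner problem: the auxiliary density estimator fits the implicit
# generator's samples. Outer problem: the generator is updated so the
# fitted estimator assigns high log-likelihood to real data, i.e. the
# generator "teaches" the estimator to match the data distribution.

rng = np.random.default_rng(0)
data = rng.normal(3.0, 1.0, size=2000)   # real data ~ N(3, 1)
noise = rng.normal(size=2000)            # fixed reparameterization noise

def fit_estimator(samples):
    # inner problem: closed-form MLE Gaussian fit to generator samples
    return samples.mean(), samples.std() + 1e-6

def outer_objective(theta):
    # implicit generator: x = mu_g + exp(log_sig_g) * z
    mu_g, log_sig_g = theta
    samples = mu_g + np.exp(log_sig_g) * noise
    mu_e, sig_e = fit_estimator(samples)
    # average data log-likelihood under the fitted estimator
    # (additive constant dropped)
    return -0.5 * np.mean(((data - mu_e) / sig_e) ** 2) - np.log(sig_e)

theta = np.array([0.0, 0.0])             # generator init: N(0, 1)
lr, eps = 0.1, 1e-4
for _ in range(300):
    grad = np.zeros(2)
    for i in range(2):                   # finite-difference gradient
        d = np.zeros(2)
        d[i] = eps
        grad[i] = (outer_objective(theta + d)
                   - outer_objective(theta - d)) / (2 * eps)
    theta += lr * grad                   # gradient ascent on the generator

print(theta[0])  # generator mean moves toward the data mean (about 3)
```

Because the outer objective is a data log-likelihood under the fitted estimator, its gradient corresponds to a KL-style matching of the generator to the data, which is the property the abstract credits for avoiding mode collapse.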
Pages: 239-255 (17 pages)
Related Papers (50 total)
  • [1] Learning Generative Models Using Denoising Density Estimators
    Bigdeli, Siavash A.
    Lin, Geng
    Dunbar, L. Andrea
    Portenier, Tiziano
    Zwicker, Matthias
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (12) : 17730 - 17741
  • [2] Bridging Explicit and Implicit Deep Generative Models via Neural Stein Estimators
    Wu, Qitian
    Gao, Rui
    Zha, Hongyuan
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [3] Learning Implicit Generative Models by Matching Perceptual Features
    dos Santos, Cicero Nogueira
    Mroueh, Youssef
    Padhi, Inkit
    Dognin, Pierre
    2019 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2019), 2019, : 4460 - 4469
  • [4] Learning Implicit Generative Models with the Method of Learned Moments
    Ravuri, Suman
    Mohamed, Shakir
    Rosca, Mihaela
    Vinyals, Oriol
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 80, 2018, 80
  • [5] Application of generative deep learning models for approximation of image distribution density
    Yashchenko, A. V.
    Potapov, A. S.
    Rodionov, S. A.
    Zhdanov, I. N.
    Shcherbakov, O. V.
    Peterson, M. V.
    JOURNAL OF OPTICAL TECHNOLOGY, 2019, 86 (12) : 769 - 773
  • [6] Amortized Bayesian inference on generative dynamical network models of epilepsy using deep neural density estimators
    Hashemi, Meysam
    Vattikonda, Anirudh N.
    Jha, Jayant
    Sip, Viktor
    Woodman, Marmaduke M.
    Bartolomei, Fabrice
    Jirsa, Viktor K.
    NEURAL NETWORKS, 2023, 163 : 178 - 194
  • [7] Learning Implicit Fields for Generative Shape Modeling
    Chen, Zhiqin
    Zhang, Hao
    2019 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2019), 2019, : 5932 - 5941
  • [8] Visualization of Basketball Tactical Evolution in Generative AI Big Models for Teaching and Learning
    Liu, Zhuoxiao
    Applied Mathematics and Nonlinear Sciences, 2024, 9 (01)
  • [9] A Kernelised Stein Statistic for Assessing Implicit Generative Models
    Xu, Wenkai
    Reinert, Gesine
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022
  • [10] Partial Identification of Treatment Effects with Implicit Generative Models
    Balazadeh, Vahid
    Syrgkanis, Vasilis
    Krishnan, Rahul G.
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022