WL-GAN: Learning to sample in generative latent space

Cited by: 0
Authors
Hou, Zeyi [1 ]
Lang, Ning [2 ]
Zhou, Xiuzhuang [1 ]
Affiliations
[1] Beijing Univ Posts & Telecommun, Sch Artificial Intelligence, Beijing 100876, Peoples R China
[2] Peking Univ Third Hosp, Beijing 100876, Peoples R China
Funding
Beijing Natural Science Foundation; National Natural Science Foundation of China
Keywords
Generative adversarial networks; Markov chain Monte Carlo; Energy-based model; Mode dropping; Chain Monte Carlo; Stochastic approximation
DOI
10.1016/j.ins.2024.121834
Chinese Library Classification
TP [Automation and Computer Technology]
Discipline Classification Code
0812
Abstract
Recent advances in generative latent space sampling for enhanced generation quality have demonstrated the benefits of the Energy-Based Model (EBM), which is often defined jointly by the generator and the discriminator of off-the-shelf Generative Adversarial Networks (GANs) of many types. However, such latent space sampling may still suffer from mode dropping, even when sampling in a low-dimensional latent space, due to the inherent complexity of data distributions with rugged energy landscapes. Motivated by the success of Wang-Landau (WL) sampling in statistical physics, we propose WL-GAN, a collaborative learning framework for generative latent space sampling, in which both the invariant distribution and the proposal distribution of the Markov chain are jointly learned on the fly, by exploiting the historical statistics of the simulated samples. We show that the two learning modules work together for a better balance between exploration and exploitation over the energy space in GAN sampling, alleviating mode dropping and improving the sample quality of GANs. Empirically, the efficacy of WL-GAN is demonstrated on both synthetic datasets and real-world image datasets, using multiple GANs. Code is available at https://github.com/zeyihou/collaborative-learn.
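To give intuition for the Wang-Landau idea the abstract builds on, the sketch below runs flat-histogram Wang-Landau sampling on a toy one-dimensional double-well energy. This is only an illustration of the classic WL mechanism (a running density-of-states estimate biases acceptance toward rarely visited energy levels, helping the chain escape modes); it is not the paper's WL-GAN algorithm, and the energy function, bin layout, and flatness threshold are all illustrative choices. In WL-GAN itself, the energy would instead be defined in the GAN latent space via the generator and discriminator.

```python
import numpy as np

def energy(z):
    # Toy rugged 1-D energy with two separated wells, standing in for a
    # multimodal latent-space energy landscape.
    return np.minimum((z + 3.0) ** 2, (z - 3.0) ** 2)

def wang_landau(n_steps=100_000, n_bins=20, e_max=9.0, seed=0):
    rng = np.random.default_rng(seed)
    edges = np.linspace(0.0, e_max, n_bins + 1)
    log_g = np.zeros(n_bins)   # running log density-of-states estimate
    hist = np.zeros(n_bins)    # visit histogram for the flatness check
    log_f = 1.0                # Wang-Landau modification factor, ln f
    z = -3.0                   # start in the left well
    b = min(int(np.searchsorted(edges, energy(z), side="right")) - 1, n_bins - 1)
    for _ in range(n_steps):
        z_new = z + rng.normal(scale=1.0)
        e_new = energy(z_new)
        if e_new < e_max:  # restrict to the bracketed energy window
            b_new = min(int(np.searchsorted(edges, e_new, side="right")) - 1,
                        n_bins - 1)
            # Accept with prob min(1, g(E_old)/g(E_new)): moves toward
            # rarely visited energy levels are favored, which flattens the
            # energy histogram and drives crossings between the two wells.
            if np.log(rng.random()) < log_g[b] - log_g[b_new]:
                z, b = z_new, b_new
        log_g[b] += log_f
        hist[b] += 1.0
        # When the histogram is roughly flat, halve ln f and reset it.
        if hist.min() > 0.8 * hist.mean():
            log_f *= 0.5
            hist[:] = 0.0
    return log_g

log_g = wang_landau()
```

With a plain Metropolis chain at low temperature, the walker would stay trapped in one well for long stretches; the WL bias makes every energy bin, including the barrier region, attractive until it has been visited, which is the exploration behavior the paper's framework builds on.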
Pages: 16