Evolutionary Generative Adversarial Networks with Crossover Based Knowledge Distillation

Cited by: 0
Authors
Li, Junjie
Zhang, Junwei
Gong, Xiaoyu
Lu, Shuai [1]
Affiliation
[1] Jilin Univ, Minist Educ, Key Lab Symbol Computat & Knowledge Engn, Changchun 130012, Peoples R China
Funding
National Key R&D Program of China; National Natural Science Foundation of China;
Keywords
DOI
10.1109/IJCNN52387.2021.9533612
CLC number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The Generative Adversarial Network (GAN) is an adversarial model that has been demonstrated to be effective for a variety of generative tasks. However, GAN and its variants also suffer from many training problems, such as mode collapse and vanishing gradients. In this paper, we first propose a general crossover operator that can be widely applied to GANs trained with evolutionary strategies, and we design an evolutionary GAN framework named C-GAN based on it. We then combine the crossover operator with evolutionary generative adversarial networks (E-GAN) to implement evolutionary generative adversarial networks with crossover (CE-GAN). With a variety of loss functions serving as mutation operators to generate mutated individuals, we evaluate the generated samples and let the mutated individuals learn from the best output in a knowledge distillation manner, imitating the best output and thereby producing better offspring. We then greedily select the best offspring as parents for subsequent training, using the discriminator as the evaluator. Experiments on real datasets demonstrate the effectiveness of CE-GAN and show that our method is competitive in terms of the quality of generated images and time efficiency.
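The evolutionary loop the abstract describes (mutation via multiple loss functions, distillation toward the best mutant's output, crossover between parents, and greedy selection by the discriminator) can be sketched on a toy problem. Everything below is illustrative, not the paper's method: the single-float "generators", the noise-scale "mutation operators", the averaging "crossover", and the closed-form fitness function are hypothetical stand-ins for the actual networks, adversarial losses, and discriminator.

```python
import random

TARGET = 1.0  # stand-in for the data distribution a generator should match

def fitness(g):
    """Higher is better: a stand-in for the discriminator's evaluation."""
    return -abs(g - TARGET)

def mutate(parent, rng):
    """One mutant per 'loss function' (here: three Gaussian noise scales)."""
    return [parent + rng.gauss(0, s) for s in (0.3, 0.1, 0.03)]

def distill(mutants, alpha=0.5):
    """Knowledge-distillation step: pull every mutant toward the best one."""
    best = max(mutants, key=fitness)
    return [m + alpha * (best - m) for m in mutants]

def crossover(a, b):
    """General crossover operator: blend two parents (uniform average here)."""
    return 0.5 * (a + b)

def evolve(generations=50, seed=0):
    rng = random.Random(seed)
    parents = [rng.uniform(-2, 2), rng.uniform(-2, 2)]
    for _ in range(generations):
        offspring = []
        for p in parents:                      # mutation + distillation
            offspring.extend(distill(mutate(p, rng)))
        offspring.append(crossover(parents[0], parents[1]))
        # Greedy survivor selection using the evaluator.
        parents = sorted(offspring, key=fitness, reverse=True)[:2]
    return parents[0]

g = evolve()
```

On this toy objective the loop converges toward `TARGET`; in the paper, the same roles are played by generator networks, adversarial losses, and the discriminator's score.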
Pages: 8