- [1] Research on Knowledge Distillation of Generative Adversarial Networks [J]. 2021 Data Compression Conference (DCC 2021), 2021: 376-376
- [2] KDGAN: Knowledge Distillation with Generative Adversarial Networks [J]. Advances in Neural Information Processing Systems 31 (NIPS 2018), 2018, 31
- [3] Application of Knowledge Distillation in Generative Adversarial Networks [J]. 2023 3rd Asia-Pacific Conference on Communications Technology and Computer Science (ACCTCS), 2023: 65-71
- [4] PKDGAN: Private Knowledge Distillation With Generative Adversarial Networks [J]. IEEE Transactions on Big Data, 2024, 10 (06): 775-788
- [6] Private Knowledge Transfer via Model Distillation with Generative Adversarial Networks [J]. ECAI 2020: 24th European Conference on Artificial Intelligence, 2020, 325: 1794-1801
- [7] Enhanced Evolutionary Generative Adversarial Networks [J]. Proceedings of the 39th Chinese Control Conference, 2020: 7534-7539
- [8] Improving Evolutionary Generative Adversarial Networks [J]. AI 2021: Advances in Artificial Intelligence, 2022, 13151: 691-702
- [9] Spatial Evolutionary Generative Adversarial Networks [J]. Proceedings of the 2019 Genetic and Evolutionary Computation Conference (GECCO'19), 2019: 472-480
- [10] Distant Speech Recognition Based on Knowledge Distillation and Generative Adversarial Network [J]. Ruan Jian Xue Bao/Journal of Software, 2019, 30: 25-34