- [43] Improving the accuracy of pruned network using knowledge distillation [J]. Pattern Analysis and Applications, 2021, 24: 819-830.
- [44] Benchmarking Neural Topic Models: An Empirical Study [C]. Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021, 2021: 4363-4368.
- [45] Knowledge Distillation: Bad Models Can Be Good Role Models [C]. Advances in Neural Information Processing Systems 35 (NeurIPS 2022), 2022.
- [48] Neural Machine Translation with Heterogeneous Topic Knowledge Embeddings [C]. Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021), 2021: 3197-3202.
- [49] Translating with Bilingual Topic Knowledge for Neural Machine Translation [C]. Proceedings of the Thirty-Third AAAI Conference on Artificial Intelligence (AAAI 2019), 2019: 7257-7264.
- [50] On using neural networks models for distillation control [C]. Distillation and Absorption '97, Vols 1 and 2, 1997, (142): 259-268.