共 50 条
- [1] Adaptive Multi-Teacher Knowledge Distillation with Meta-Learning [J]. 2023 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO, ICME, 2023, : 1943 - 1948
- [2] Fast Network Alignment via Graph Meta-Learning [J]. IEEE INFOCOM 2020 - IEEE CONFERENCE ON COMPUTER COMMUNICATIONS, 2020, : 686 - 695
- [3] Boosting Graph Neural Networks via Adaptive Knowledge Distillation [J]. THIRTY-SEVENTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 37 NO 6, 2023, : 7793 - 7801
- [5] Knowledge Distillation for Model-Agnostic Meta-Learning [J]. ECAI 2020: 24TH EUROPEAN CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2020, 325 : 1355 - 1362
- [6] Decoupled knowledge distillation method based on meta-learning [J]. HIGH-CONFIDENCE COMPUTING, 2024, 4 (01):
- [7] TAdaNet: Task-Adaptive Network for Graph-Enriched Meta-Learning [J]. KDD '20: PROCEEDINGS OF THE 26TH ACM SIGKDD INTERNATIONAL CONFERENCE ON KNOWLEDGE DISCOVERY & DATA MINING, 2020, : 1789 - 1799
- [9] Boosting Meta-Learning Cold-Start Recommendation with Graph Neural Network [J]. PROCEEDINGS OF THE 32ND ACM INTERNATIONAL CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, CIKM 2023, 2023, : 4105 - 4109