Total: 28 records
- [1] Extract then Distill: Efficient and Effective Task-Agnostic BERT Distillation. Artificial Neural Networks and Machine Learning - ICANN 2021, Pt. III, 2021, 12893: 570-581.
- [2] Fundamentals of Task-Agnostic Data Valuation. Thirty-Seventh AAAI Conference on Artificial Intelligence, Vol. 37, No. 8, 2023: 9226-9234.
- [3] Investigating the Effectiveness of Task-Agnostic Prefix Prompt for Instruction Following. Thirty-Eighth AAAI Conference on Artificial Intelligence, Vol. 38, No. 17, 2024: 19386-19394.
- [4] Continual deep reinforcement learning with task-agnostic policy distillation. Scientific Reports, 2024, 14(1).
- [6] MiniLM: Deep Self-Attention Distillation for Task-Agnostic Compression of Pre-Trained Transformers. Advances in Neural Information Processing Systems 33 (NeurIPS 2020), 2020, 33.
- [7] TADA: Efficient Task-Agnostic Domain Adaptation for Transformers. Findings of the Association for Computational Linguistics: ACL 2023, 2023: 487-503.
- [9] Towards a Task-agnostic Distillation Methodology for Creating Edge Foundation Models. Proceedings of the 2024 Workshop on Edge and Mobile Foundation Models, EdgeFM 2024, 2024: 10-15.
- [10] Task-Agnostic Self-Distillation for Few-Shot Action Recognition. Proceedings of the Thirty-Third International Joint Conference on Artificial Intelligence, IJCAI 2024, 2024: 5425-5433.