50 entries in total
- [1] Autocorrelation Matrix Knowledge Distillation: A Task-Specific Distillation Method for BERT Models [J]. Applied Sciences (Switzerland), 2024, 14(20):
- [3] Improved Knowledge Distillation for Crowd Counting on IoT Devices [J]. 2023 IEEE International Conference on Edge Computing and Communications (EDGE), 2023: 207-214
- [4] Adversarial Data Augmentation for Task-Specific Knowledge Distillation of Pre-trained Transformers [J]. Thirty-Sixth AAAI Conference on Artificial Intelligence / Thirty-Fourth Conference on Innovative Applications of Artificial Intelligence / Twelfth Symposium on Educational Advances in Artificial Intelligence, 2022: 11685-11693
- [5] A Model of Learning Task-Specific Knowledge for a New Task [J]. Proceedings of the Twenty-First Annual Conference of the Cognitive Science Society, 1999: 730-735
- [6] Repdistiller: Knowledge Distillation Scaled by Re-parameterization for Crowd Counting [J]. Pattern Recognition and Computer Vision, PRCV 2023, Pt X, 2024, 14434: 383-394
- [7] Incorporating Task-Specific Concept Knowledge into Script Learning [J]. 17th Conference of the European Chapter of the Association for Computational Linguistics, EACL 2023, 2023: 3026-3040
- [8] Experience Effects in Auditing: The Role of Task-Specific Knowledge [J]. Accounting Review, 1990, 65(1): 72-92
- [10] Task-Specific Knowledge of the Law of Pendulum Motion in Children and Adults [J]. Swiss Journal of Psychology, 2005, 64(2): 103-114