Total: 50 entries
- [1] Towards Inadequately Pre-trained Models in Transfer Learning [J]. 2023 IEEE/CVF International Conference on Computer Vision (ICCV 2023), 2023: 19340-19351
- [2] Instabilities of Offline RL with Pre-Trained Neural Representation [J]. International Conference on Machine Learning, Vol 139, 2021
- [3] RanPAC: Random Projections and Pre-trained Models for Continual Learning [J]. Advances in Neural Information Processing Systems 36 (NeurIPS 2023), 2023
- [5] Collaborative Learning across Heterogeneous Systems with Pre-Trained Models [J]. Thirty-Eighth AAAI Conference on Artificial Intelligence, Vol 38 No 20, 2024: 22668-22668
- [6] Meta Distant Transfer Learning for Pre-trained Language Models [J]. 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021), 2021: 9742-9752
- [7] MODEL SPIDER: Learning to Rank Pre-Trained Models Efficiently [J]. Advances in Neural Information Processing Systems 36 (NeurIPS 2023), 2023
- [8] Do Pre-trained Models Benefit Equally in Continual Learning? [J]. 2023 IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), 2023: 6474-6482
- [9] Class-Incremental Learning with Strong Pre-trained Models [J]. 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2022: 9591-9600
- [10] LogME: Practical Assessment of Pre-trained Models for Transfer Learning [J]. International Conference on Machine Learning, Vol 139, 2021