50 entries total
- [1] Continual Learning with Bayesian Model Based on a Fixed Pre-trained Feature Extractor [J]. Medical Image Computing and Computer Assisted Intervention - MICCAI 2021, Pt V, 2021, 12905: 397-406
- [2] Pre-trained Model Based Feature Envy Detection [J]. 2023 IEEE/ACM 20th International Conference on Mining Software Repositories (MSR), 2023: 430-440
- [3] SLCA: Slow Learner with Classifier Alignment for Continual Learning on a Pre-trained Model [J]. 2023 IEEE/CVF International Conference on Computer Vision (ICCV 2023), 2023: 19091-19101
- [4] RanPAC: Random Projections and Pre-trained Models for Continual Learning [J]. Advances in Neural Information Processing Systems 36 (NeurIPS 2023), 2023
- [5] Do Pre-trained Models Benefit Equally in Continual Learning? [J]. 2023 IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), 2023: 6474-6482
- [7] Leveraging Pre-Trained Acoustic Feature Extractor for Affective Vocal Bursts Tasks [J]. Proceedings of 2022 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA ASC), 2022: 1412-1417
- [8] Preserving Cross-Linguality of Pre-trained Models via Continual Learning [J]. RepL4NLP 2021: Proceedings of the 6th Workshop on Representation Learning for NLP, 2021: 64-71
- [10] A knowledge extraction framework for domain-specific application with simplified pre-trained language model and attention-based feature extractor [J]. Service Oriented Computing and Applications, 2022, 16: 121-131