共 50 条
- [1] Multi-stage Pre-training over Simplified Multimodal Pre-training Models [J]. 59TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 11TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING, VOL 1 (ACL-IJCNLP 2021), 2021, : 2556 - 2565
- [2] Pre-training Mention Representations in Coreference Models [J]. PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020, : 8534 - 8540
- [4] A Method of Relation Extraction Using Pre-training Models [J]. 2020 13TH INTERNATIONAL SYMPOSIUM ON COMPUTATIONAL INTELLIGENCE AND DESIGN (ISCID 2020), 2020, : 176 - 179
- [5] Improving the Sample Efficiency of Pre-training Language Models [J]. ERCIM NEWS, 2024, (136): : 38 - 40
- [8] Pre-training with Diffusion Models for Dental Radiography Segmentation [J]. DEEP GENERATIVE MODELS, DGM4MICCAI 2023, 2024, 14533 : 174 - 182
- [9] Research frontiers of pre-training mathematical models based on BERT [J]. 2022 IEEE INTERNATIONAL CONFERENCE ON ELECTRICAL ENGINEERING, BIG DATA AND ALGORITHMS (EEBDA), 2022, : 154 - 158
- [10] AUGER: Automatically Generating Review Comments with Pre-training Models [J]. PROCEEDINGS OF THE 30TH ACM JOINT MEETING EUROPEAN SOFTWARE ENGINEERING CONFERENCE AND SYMPOSIUM ON THE FOUNDATIONS OF SOFTWARE ENGINEERING, ESEC/FSE 2022, 2022, : 1009 - 1021