50 items in total
- [1] Research frontiers of pre-training mathematical models based on BERT [J]. 2022 IEEE International Conference on Electrical Engineering, Big Data and Algorithms (EEBDA), 2022: 154-158
- [2] Clone Detection with Pre-training Enhanced Code Representation [J]. Ruan Jian Xue Bao/Journal of Software, 2022, 33(5): 1758-1773
- [3] VulBERTa: Simplified Source Code Pre-Training for Vulnerability Detection [J]. 2022 International Joint Conference on Neural Networks (IJCNN), 2022
- [5] Contrastive Code-Comment Pre-training [J]. 2022 IEEE International Conference on Data Mining (ICDM), 2022: 398-407
- [6] Multi-stage Pre-training over Simplified Multimodal Pre-training Models [J]. 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, Vol 1 (ACL-IJCNLP 2021), 2021: 2556-2565
- [9] Pre-Training Transformers as Energy-Based Cloze Models [J]. Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020: 285-294