50 records in total
- [41] Research frontiers of pre-training mathematical models based on BERT. 2022 IEEE International Conference on Electrical Engineering, Big Data and Algorithms (EEBDA), 2022: 154-158.
- [42] Task-adaptive Pre-training of Language Models with Word Embedding Regularization. Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021, 2021: 4546-4553.
- [43] Pre-Training Transformers as Energy-Based Cloze Models. Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020: 285-294.
- [44] Exploring Sensory Knowledge and Pre-training Language Models for Chinese Metaphor Detection. 2024 International Conference on Asian Language Processing (IALP 2024), 2024: 120-126.
- [46] Evaluation of pre-training large language models on leadership-class supercomputers. The Journal of Supercomputing, 2023, 79: 20747-20768.
- [47] Stop Pre-Training: Adapt Visual-Language Models to Unseen Languages. 61st Conference of the Association for Computational Linguistics (ACL 2023), Vol. 2, 2023: 366-375.
- [48] Survey on Vision-language Pre-training. Ruan Jian Xue Bao/Journal of Software, 2023, 34 (5): 2000-2023.
- [49] Sigmoid Loss for Language Image Pre-Training. 2023 IEEE/CVF International Conference on Computer Vision (ICCV 2023), 2023: 11941-11952.
- [50] REPT: Bridging Language Models and Machine Reading Comprehension via Retrieval-Based Pre-training. Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021, 2021: 150-163.