50 entries in total
- [1] ERNIE 2.0: A Continual Pre-Training Framework for Language Understanding. Proceedings of the Thirty-Fourth AAAI Conference on Artificial Intelligence (AAAI 2020), 2020, 34: 8968-8975
- [2] Pre-Training for Query Rewriting in a Spoken Language Understanding System. 2020 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP 2020), 2020: 7969-7973
- [3] Continual Pre-Training of Python Language Model to mT5. Computer Software, 2023, 40(4): 10-21
- [5] Unified Language Model Pre-training for Natural Language Understanding and Generation. Advances in Neural Information Processing Systems 32 (NeurIPS 2019), 2019, 32
- [7] Geo-BERT Pre-training Model for Query Rewriting in POI Search. Findings of the Association for Computational Linguistics: EMNLP 2021, 2021: 2209-2214
- [8] Continual Pre-training of Language Models for Math Problem Understanding with Syntax-Aware Memory Network. Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (ACL 2022), Vol. 1: Long Papers, 2022: 5923-5933
- [9] MPNet: Masked and Permuted Pre-training for Language Understanding. Advances in Neural Information Processing Systems 33 (NeurIPS 2020), 2020, 33
- [10] Speech Model Pre-training for End-to-End Spoken Language Understanding. Interspeech 2019, 2019: 814-818