- [2] Author Correction: A pre-trained BERT for Korean medical natural language processing [J]. Scientific Reports, 13
- [4] A Study of Pre-trained Language Models in Natural Language Processing [J]. Proceedings of the 2020 IEEE International Conference on Smart Cloud (SmartCloud 2020), 2020: 116-121
- [8] Pre-trained models for natural language processing: A survey [J]. Science China Technological Sciences, 2020, 63: 1872-1897
- [9] Revisiting Pre-trained Models for Chinese Natural Language Processing [J]. Findings of the Association for Computational Linguistics: EMNLP 2020, 2020: 657-668
- [10] Revisiting Pre-trained Language Models and their Evaluation for Arabic Natural Language Processing [J]. Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP): 3135-3151