50 records in total
- [2] A complex network approach to analyse pre-trained language models for ancient Chinese [J]. Royal Society Open Science, 2024, 11(5).
- [3] Revisiting Pre-trained Models for Chinese Natural Language Processing [J]. Findings of the Association for Computational Linguistics: EMNLP 2020, 2020: 657-668.
- [6] A Data Cartography based MixUp for Pre-trained Language Models [J]. NAACL 2022: The 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2022: 4244-4250.
- [9] A Transformer Based Approach To Detect Suicidal Ideation Using Pre-Trained Language Models [J]. 2020 23rd International Conference on Computer and Information Technology (ICCIT 2020), 2020.
- [10] Annotating Columns with Pre-trained Language Models [J]. Proceedings of the 2022 International Conference on Management of Data (SIGMOD '22), 2022: 1493-1503.