共 50 条
- [1] Chinese Grammatical Correction Using BERT-based Pre-trained Model [J]. 1ST CONFERENCE OF THE ASIA-PACIFIC CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 10TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (AACL-IJCNLP 2020), 2020, : 163 - 168
- [2] Patent classification with pre-trained Bert model [J]. JOURNAL OF THE FACULTY OF ENGINEERING AND ARCHITECTURE OF GAZI UNIVERSITY, 2024, 39 (04): : 2485 - 2496
- [5] CANCN-BERT: A Joint Pre-Trained Language Model for Classical and Modern Chinese [J]. PROCEEDINGS OF THE 30TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT, CIKM 2021, 2021, : 3112 - 3116
- [6] Using Pre-trained Deeply Contextual Model BERT for Russian Named Entity Recognition [J]. ANALYSIS OF IMAGES, SOCIAL NETWORKS AND TEXTS (AIST 2019), 2020, 1086 : 167 - 173
- [9] Entity Recognition for Chinese Hazardous Chemical Accident Data Based on Rules and a Pre-Trained Model [J]. APPLIED SCIENCES-BASEL, 2023, 13 (01):
- [10] Leveraging Pre-trained BERT for Audio Captioning [J]. 2022 30TH EUROPEAN SIGNAL PROCESSING CONFERENCE (EUSIPCO 2022), 2022, : 1145 - 1149