共 50 条
- [1] Research on Chinese Intent Recognition Based on BERT pre-trained model [J]. 2020 5TH INTERNATIONAL CONFERENCE ON MATHEMATICS AND ARTIFICIAL INTELLIGENCE (ICMAI 2020), 2020, : 128 - 132
- [2] Patent classification with pre-trained Bert model [J]. JOURNAL OF THE FACULTY OF ENGINEERING AND ARCHITECTURE OF GAZI UNIVERSITY, 2024, 39 (04): : 2485 - 2496
- [3] Chinese Grammatical Correction Using BERT-based Pre-trained Model [J]. 1ST CONFERENCE OF THE ASIA-PACIFIC CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 10TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (AACL-IJCNLP 2020), 2020, : 163 - 168
- [4] μBERT: Mutation Testing using Pre-Trained Language Models [J]. 2022 IEEE 15TH INTERNATIONAL CONFERENCE ON SOFTWARE TESTING, VERIFICATION AND VALIDATION WORKSHOPS (ICSTW 2022), 2022, : 160 - 169
- [8] Lawformer: A pre-trained language model for Chinese legal long documents [J]. AI OPEN, 2021, 2 : 79 - 84
- [9] AnchiBERT: A Pre-Trained Model for Ancient Chinese Language Understanding and Generation [J]. 2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021,
- [10] Detection of Chinese Deceptive Reviews Based on Pre-Trained Language Model [J]. APPLIED SCIENCES-BASEL, 2022, 12 (07):