50 entries in total
- [1] Dict-BERT: Enhancing Language Model Pre-training with Dictionary. Findings of the Association for Computational Linguistics (ACL 2022), 2022, pp. 1907-1918.
- [2] Improving the Identification of Abusive Language Through Careful Design of Pre-training Tasks. Pattern Recognition (MCPR 2023), 2023, vol. 13902, pp. 283-292.
- [3] Kaleido-BERT: Vision-Language Pre-training on Fashion Domain. 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR 2021), 2021, pp. 12642-12652.
- [4] A Domain-adaptive Pre-training Approach for Language Bias Detection in News. 2022 ACM/IEEE Joint Conference on Digital Libraries (JCDL), 2022.
- [5] BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT 2019), vol. 1, 2019, pp. 4171-4186.
- [6] Pre-Training BERT on Domain Resources for Short Answer Grading. 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP 2019), 2019, pp. 6071-6075.
- [8] Domain-adaptive pre-training on a BERT model for the automatic detection of misogynistic tweets in Spanish. Social Network Analysis and Mining, vol. 13.
- [9] MenuNER: Domain-Adapted BERT Based NER Approach for a Domain with Limited Dataset and Its Application to Food Menu Domain. Applied Sciences-Basel, 2021, 11(13).