50 entries in total
- [41] Reusing a Pretrained Language Model on Languages with Limited Corpora for Unsupervised NMT PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020: 2703-2711
- [42] On the Effectiveness of Adapter-based Tuning for Pretrained Language Model Adaptation 59TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 11TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING, VOL 1 (ACL-IJCNLP 2021), 2021: 2208-2222
- [44] KBioXLM: A Knowledge-anchored Biomedical Multilingual Pretrained Language Model FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (EMNLP 2023), 2023: 11239-11250
- [45] Unsupervised Domain Adaptation of a Pretrained Cross-Lingual Language Model PROCEEDINGS OF THE TWENTY-NINTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2020: 3672-3678
- [47] A Hybrid Citation Recommendation Model With SciBERT and GraphSAGE IEEE TRANSACTIONS ON SYSTEMS MAN CYBERNETICS-SYSTEMS, 2025, 55 (02): 852-863
- [48] Chinese Prosodic Structure Prediction Based on a Pretrained Language Representation Model Tianjin University (53): 265-271
- [49] Pretrained Transformers for Text Ranking: BERT and Beyond SIGIR '21 - PROCEEDINGS OF THE 44TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL, 2021: 2666-2668