共 50 条
- [41] Enriching Pre-trained Language Model with Entity Information for Relation Classification [J]. PROCEEDINGS OF THE 28TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT (CIKM '19), 2019, : 2361 - 2364
- [42] AMBERT: A Pre-trained Language Model with Multi-Grained Tokenization [J]. FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL-IJCNLP 2021, 2021, : 421 - 435
- [43] Leveraging Pre-Trained Language Model for Summary Generation on Short Text [J]. IEEE ACCESS, 2020, 8 : 228798 - 228803
- [44] Syntax-guided Contrastive Learning for Pre-trained Language Model [J]. FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022), 2022, : 2430 - 2440
- [45] LaoPLM: Pre-trained Language Models for Lao [J]. LREC 2022: THIRTEEN INTERNATIONAL CONFERENCE ON LANGUAGE RESOURCES AND EVALUATION, 2022, : 6506 - 6512
- [46] Comparing Pre-Trained Language Model for Arabic Hate Speech Detection [J]. COMPUTACION Y SISTEMAS, 2024, 28 (02): : 681 - 693
- [47] Detection of Chinese Deceptive Reviews Based on Pre-Trained Language Model [J]. APPLIED SCIENCES-BASEL, 2022, 12 (07):
- [48] Pre-trained Language Model with Prompts for Temporal Knowledge Graph Completion [J]. arXiv, 2023,
- [49] JiuZhang: A Chinese Pre-trained Language Model for Mathematical Problem Understanding [J]. PROCEEDINGS OF THE 28TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2022, 2022, : 4571 - 4581
- [50] Question-answering Forestry Pre-trained Language Model: ForestBERT [J]. Linye Kexue/Scientia Silvae Sinicae, 2024, 60 (09): : 99 - 110