- [1] Pre-trained Language Model with Prompts for Temporal Knowledge Graph Completion [J]. arXiv, 2023.
- [2] Commonsense Knowledge Base Completion with Relational Graph Attention Network and Pre-trained Language Model [C]. Proceedings of the 31st ACM International Conference on Information and Knowledge Management (CIKM 2022), 2022: 4104-4108.
- [3] SimKGC: Simple Contrastive Knowledge Graph Completion with Pre-trained Language Models [C]. Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (ACL 2022), Vol. 1: Long Papers, 2022: 4281-4294.
- [4] NMT Enhancement based on Knowledge Graph Mining with Pre-trained Language Model [C]. 2020 22nd International Conference on Advanced Communication Technology (ICACT): Digital Security Global Agenda for Safe Society!, 2020: 185-189.
- [5] Interpretability of Entity Matching Based on Pre-trained Language Model [J]. Ruan Jian Xue Bao/Journal of Software, 2023, 34(3): 1087-1108.
- [7] Enriching Pre-trained Language Model with Entity Information for Relation Classification [C]. Proceedings of the 28th ACM International Conference on Information & Knowledge Management (CIKM '19), 2019: 2361-2364.
- [8] MedBERT: A Pre-trained Language Model for Biomedical Named Entity Recognition [C]. Proceedings of 2022 Asia-Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA ASC), 2022: 1482-1488.
- [9] Knowledge Graph Completion Using a Pre-Trained Language Model Based on Categorical Information and Multi-Layer Residual Attention [J]. Applied Sciences, 2024, 14(11).
- [10] Knowledge Enhanced Pre-trained Language Model for Product Summarization [C]. Natural Language Processing and Chinese Computing (NLPCC 2022), Part II, 2022, 13552: 263-273.