50 records in total
- [1] SimKGC: Simple Contrastive Knowledge Graph Completion with Pre-trained Language Models [J]. Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (ACL 2022), Vol 1: Long Papers, 2022: 4281-4294
- [2] ERICA: Improving Entity and Relation Understanding for Pre-trained Language Models via Contrastive Learning [J]. Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (ACL-IJCNLP 2021), Vol 1, 2021: 3350-3363
- [3] An empirical study of pre-trained language models in simple knowledge graph question answering [J]. World Wide Web, 2023, 26 (5): 2855-2886
- [5] ContraBERT: Enhancing Code Pre-trained Models via Contrastive Learning [J]. Proceedings of the 2023 IEEE/ACM 45th International Conference on Software Engineering (ICSE), 2023: 2476-2487
- [8] An Empirical Study on Pre-trained Embeddings and Language Models for Bot Detection [J]. Proceedings of the 4th Workshop on Representation Learning for NLP (RepL4NLP-2019), 2019: 148-155
- [10] Leveraging pre-trained language models for code generation [J]. Complex & Intelligent Systems, 2024, 10 (3): 3955-3980