50 items in total
- [1] Disentangling Semantics and Syntax in Sentence Embeddings with Pre-trained Language Models [C]. 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT 2021), 2021: 1372-1379.
- [2] Distilling Relation Embeddings from Pre-trained Language Models [C]. 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021), 2021: 9044-9062.
- [3] An Empirical Study on Pre-trained Embeddings and Language Models for Bot Detection [C]. 4th Workshop on Representation Learning for NLP (RepL4NLP-2019), 2019: 148-155.
- [4] From Word Embeddings to Pre-Trained Language Models: A State-of-the-Art Walkthrough [J]. Applied Sciences-Basel, 2022, 12(17).
- [5] General Purpose Text Embeddings from Pre-trained Language Models for Scalable Inference [C]. Findings of the Association for Computational Linguistics: EMNLP 2020, 2020.
- [6] Integrating Knowledge Graph Embeddings and Pre-trained Language Models in Hypercomplex Spaces [C]. The Semantic Web, ISWC 2023, Part I, 2023, 14265: 388-407.
- [9] From Cloze to Comprehension: Retrofitting Pre-trained Masked Language Models to Pre-trained Machine Reader [C]. Advances in Neural Information Processing Systems 36 (NeurIPS 2023), 2023.
- [10] Solving ESL Sentence Completion Questions via Pre-trained Neural Language Models [C]. Artificial Intelligence in Education (AIED 2021), Part II, 2021, 12749: 256-261.