共 50 条
- [1] On the Sentence Embeddings from Pre-trained Language Models [J]. PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020, : 9119 - 9130
- [2] Capturing Semantics for Imputation with Pre-trained Language Models [J]. 2021 IEEE 37TH INTERNATIONAL CONFERENCE ON DATA ENGINEERING (ICDE 2021), 2021, : 61 - 72
- [3] Distilling Relation Embeddings from Pre-trained Language Models [J]. 2021 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2021), 2021, : 9044 - 9062
- [4] An Empirical study on Pre-trained Embeddings and Language Models for Bot Detection [J]. 4TH WORKSHOP ON REPRESENTATION LEARNING FOR NLP (REPL4NLP-2019), 2019, : 148 - 155
- [5] On the Branching Bias of Syntax Extracted from Pre-trained Language Models [J]. FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2020, 2020, : 4473 - 4478
- [6] Integrating Knowledge Graph Embeddings and Pre-trained Language Models in Hypercomplex Spaces [J]. SEMANTIC WEB, ISWC 2023, PART I, 2023, 14265 : 388 - 407
- [8] From Word Embeddings to Pre-Trained Language Models: A State-of-the-Art Walkthrough [J]. APPLIED SCIENCES-BASEL, 2022, 12 (17):
- [9] General Purpose Text Embeddings from Pre-trained Language Models for Scalable Inference [J]. FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2020, 2020,
- [10] Solving ESL Sentence Completion Questions via Pre-trained Neural Language Models [J]. ARTIFICIAL INTELLIGENCE IN EDUCATION (AIED 2021), PT II, 2021, 12749 : 256 - 261