50 items in total
- [1] On the Sentence Embeddings from Pre-trained Language Models [J]. Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020: 9119-9130
- [2] Distilling Relation Embeddings from Pre-trained Language Models [J]. 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021), 2021: 9044-9062
- [3] An Empirical Study on Pre-trained Embeddings and Language Models for Bot Detection [J]. 4th Workshop on Representation Learning for NLP (RepL4NLP-2019), 2019: 148-155
- [4] Disentangling Semantics and Syntax in Sentence Embeddings with Pre-trained Language Models [J]. 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT 2021), 2021: 1372-1379
- [5] General Purpose Text Embeddings from Pre-trained Language Models for Scalable Inference [J]. Findings of the Association for Computational Linguistics, EMNLP 2020, 2020
- [6] Integrating Knowledge Graph Embeddings and Pre-trained Language Models in Hypercomplex Spaces [J]. Semantic Web, ISWC 2023, Part I, 2023, 14265: 388-407
- [8] The Impact of Using Pre-trained Word Embeddings in Sinhala Chatbots [J]. 2020 20th International Conference on Advances in ICT for Emerging Regions (ICTer 2020), 2020: 161-165
- [9] Disambiguating Clinical Abbreviations Using Pre-trained Word Embeddings [J]. Proceedings of the 14th International Joint Conference on Biomedical Engineering Systems and Technologies, Vol. 5: HEALTHINF, 2021: 501-508