50 entries in total
- [41] Capturing Semantics for Imputation with Pre-trained Language Models. 2021 IEEE 37th International Conference on Data Engineering (ICDE 2021), 2021: 61-72
- [42] Compressing Pre-trained Language Models by Matrix Decomposition. 1st Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics and the 10th International Joint Conference on Natural Language Processing (AACL-IJCNLP 2020), 2020: 884-889
- [43] On the Sentence Embeddings from Pre-trained Language Models. Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020: 9119-9130
- [44] Pre-trained Language Models for Keyphrase Prediction: A Review. ICT Express, 2024, 10(4): 871-890
- [45] Robust Lottery Tickets for Pre-trained Language Models. Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (ACL 2022), Vol. 1 (Long Papers), 2022: 2211-2224
- [47] Pre-trained Models for Natural Language Processing: A Survey. Science China Technological Sciences, 2020, 63: 1872-1897
- [48] Leveraging Pre-trained Language Models for Code Generation. Complex & Intelligent Systems, 2024, 10(3): 3955-3980
- [50] Learning and Evaluating a Differentially Private Pre-trained Language Model. Findings of the Association for Computational Linguistics: EMNLP 2021, 2021: 1178-1189