50 records in total
- [31] How Different are Pre-trained Transformers for Text Ranking? Advances in Information Retrieval, Pt II, 2022, 13186: 207-214
- [33] Predicting Terms in IS-A Relations with Pre-trained Transformers. 13th International Joint Conference on Natural Language Processing and the 3rd Conference of the Asia-Pacific Chapter of the Association for Computational Linguistics, IJCNLP-AACL 2023, 2023: 134-148
- [34] Generative pre-trained transformers (GPT) for surface engineering. Surface & Coatings Technology, 2023, 466
- [35] Generating Extended and Multilingual Summaries with Pre-trained Transformers. LREC 2022: Thirteenth International Conference on Language Resources and Evaluation, 2022: 1640-1650
- [36] Generative Pre-trained Transformers for Biologically Inspired Design. Proceedings of ASME 2022 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference, IDETC-CIE 2022, Vol 6, 2022
- [37] Pre-trained Language Models for the Legal Domain: A Case Study on Indian Law. Proceedings of the 19th International Conference on Artificial Intelligence and Law, ICAIL 2023, 2023: 187-196
- [38] Intent Classification Using Pre-trained Language Agnostic Embeddings for Low Resource Languages. Interspeech 2022, 2022: 3473-3477