50 records in total
- [31] Emotion Recognition with Pre-Trained Transformers Using Multimodal Signals. 2022 10th International Conference on Affective Computing and Intelligent Interaction (ACII), 2022.
- [32] Can Generative Pre-trained Transformers (GPT) Pass Assessments in Higher Education Programming Courses? Proceedings of the 2023 Conference on Innovation and Technology in Computer Science Education (ITiCSE 2023), Vol. 1, 2023: 117-123.
- [33] Nodule Detection in Chest Radiographs with Unsupervised Pre-trained Detection Transformers. 2023 IEEE 20th International Symposium on Biomedical Imaging (ISBI), 2023.
- [35] Do Syntax Trees Help Pre-trained Transformers Extract Information? 16th Conference of the European Chapter of the Association for Computational Linguistics (EACL 2021), 2021: 2647-2661.
- [36] Unsupervised Out-of-Domain Detection via Pre-trained Transformers. 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (ACL-IJCNLP 2021), Vol. 1, 2021: 1052-1061.
- [37] On Checking Robustness on Named Entity Recognition with Pre-trained Transformers Models. Baltic Journal of Modern Computing, 2023, 11(4): 591-606.
- [38] Syntax-BERT: Improving Pre-trained Transformers with Syntax Trees. 16th Conference of the European Chapter of the Association for Computational Linguistics (EACL 2021), 2021: 3011-3020.
- [40] Logical Transformers: Infusing Logical Structures into Pre-Trained Language Models. Findings of the Association for Computational Linguistics: ACL 2023, 2023: 1762-1773.