共 50 条
- [21] DeFormer: Decomposing Pre-trained Transformers for Faster Question Answering 58TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2020), 2020, : 4487 - 4497
- [22] Routing Generative Pre-Trained Transformers for Printed Circuit Board 2024 INTERNATIONAL SYMPOSIUM OF ELECTRONICS DESIGN AUTOMATION, ISEDA 2024, 2024, : 160 - 165
- [26] Causal Interpretation of Self-Attention in Pre-Trained Transformers ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
- [27] An Empirical Study of Pre-trained Transformers for Arabic Information Extraction PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020, : 4727 - 4734
- [28] Handwritten Document Recognition Using Pre-trained Vision Transformers DOCUMENT ANALYSIS AND RECOGNITION-ICDAR 2024, PT II, 2024, 14805 : 173 - 190
- [29] Experiments in News Bias Detection with Pre-trained Neural Transformers ADVANCES IN INFORMATION RETRIEVAL, ECIR 2024, PT IV, 2024, 14611 : 270 - 284
- [30] Emotion Recognition with Pre-Trained Transformers Using Multimodal Signals 2022 10TH INTERNATIONAL CONFERENCE ON AFFECTIVE COMPUTING AND INTELLIGENT INTERACTION (ACII), 2022,