- [21] Automating Code-Related Tasks Through Transformers: The Impact of Pre-training. 2023 IEEE/ACM 45th International Conference on Software Engineering (ICSE), 2023, pp. 2425-2437.
- [22] HIBERT: Document Level Pre-training of Hierarchical Bidirectional Transformers for Document Summarization. 57th Annual Meeting of the Association for Computational Linguistics (ACL 2019), 2019, pp. 5059-5069.
- [24] Designing Pre-training Datasets from Unlabeled Data for EEG Classification with Transformers. 2024 IEEE 22nd Mediterranean Electrotechnical Conference (MELECON 2024), 2024, pp. 25-30.
- [25] TUTA: Tree-based Transformers for Generally Structured Table Pre-training. KDD '21: Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery & Data Mining, 2021, pp. 1780-1790.
- [27] SatMAE: Pre-training Transformers for Temporal and Multi-Spectral Satellite Imagery. Advances in Neural Information Processing Systems 35 (NeurIPS 2022), 2022.
- [28] Mask and Reason: Pre-Training Knowledge Graph Transformers for Complex Logical Queries. Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD 2022), 2022, pp. 1120-1130.