50 items in total
- [2] Pre-Training Transformers as Energy-Based Cloze Models. Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020: 285–294.
- [3] Evaluation of FractalDB Pre-training with Vision Transformers. Seimitsu Kogaku Kaishi/Journal of the Japan Society for Precision Engineering, 2023, 89(1): 99–104.
- [4] RecGPT: Generative Pre-training for Text-based Recommendation. Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics, Vol. 2: Short Papers, 2024: 302–313.
- [5] Pre-training of Graph Augmented Transformers for Medication Recommendation. Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence (IJCAI), 2019: 5953–5959.
- [6] Lifting the Curse of Multilinguality by Pre-training Modular Transformers. NAACL 2022: The 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2022: 3479–3495.
- [8] Unsupervised Extractive Summarization by Pre-training Hierarchical Transformers. Findings of the Association for Computational Linguistics: EMNLP 2020, 2020: 1784–1795.
- [10] TUTA: Tree-based Transformers for Generally Structured Table Pre-training. KDD '21: Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery & Data Mining, 2021: 1780–1790.