共 50 条
- [2] Evaluation of FractalDB Pre-training with Vision Transformers Seimitsu Kogaku Kaishi/Journal of the Japan Society for Precision Engineering, 2023, 89 (01): : 99 - 104
- [3] Pre-training of Graph Augmented Transformers for Medication Recommendation PROCEEDINGS OF THE TWENTY-EIGHTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2019, : 5953 - 5959
- [5] Unsupervised Extractive Summarization by Pre-training Hierarchical Transformers FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2020, 2020, : 1784 - 1795
- [6] Take a Closer Look at Multilinguality! Improve Multilingual Pre-Training Using Monolingual Corpora Only FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS - EMNLP 2023, 2023, : 2891 - 2907
- [7] PeCo: Perceptual Codebook for BERT Pre-training of Vision Transformers THIRTY-SEVENTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 37 NO 1, 2023, : 552 - 560
- [8] BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding 2019 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL HLT 2019), VOL. 1, 2019, : 4171 - 4186
- [9] Pre-training Vision Transformers with Very Limited Synthesized Images 2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2023), 2023, : 20303 - 20312