50 records in total
- [21] Revisiting Weakly Supervised Pre-Training of Visual Perception Models [J]. 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR 2022), 2022: 794-804
- [22] Pre-training Summarization Models of Structured Datasets for Cardinality Estimation [J]. Proceedings of the VLDB Endowment, 2021, 15(3): 414-426
- [23] Progress in protein pre-training models integrating structural knowledge [J]. Wuli Xuebao/Acta Physica Sinica, 2024, 73(18)
- [24] Improving Image Representations via MoCo Pre-training for Multimodal CXR Classification [J]. Medical Image Understanding and Analysis (MIUA 2022), 2022, 13413: 623-635
- [25] Pre-Training Transformers as Energy-Based Cloze Models [J]. Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020: 285-294
- [28] Supervised Contrastive Pre-training for Mammographic Triage Screening Models [J]. Medical Image Computing and Computer Assisted Intervention (MICCAI 2021), Pt VII, 2021, 12907: 129-139
- [29] UER: An Open-Source Toolkit for Pre-training Models [J]. 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP 2019): Proceedings of System Demonstrations, 2019: 241-246
- [30] On the importance of pre-training data volume for compact language models [J]. Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020: 7853-7858