- [31] Graph Strength for Identification of Pre-training Desynchronization [C]. Intelligent Technologies: Design and Applications for Society, CITIS 2022, 2023, 607: 36-44
- [32] MarkupLM: Pre-training of Text and Markup Language for Visually Rich Document Understanding [C]. Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (ACL 2022), Vol 1 (Long Papers), 2022: 6078-6087
- [33] Knowledge Boosting: Rethinking Medical Contrastive Vision-Language Pre-training [C]. Medical Image Computing and Computer Assisted Intervention, MICCAI 2023, Pt I, 2023, 14220: 405-415
- [35] A New Pre-training Method for Training Deep Learning Models with Application to Spoken Language Understanding [C]. 17th Annual Conference of the International Speech Communication Association (INTERSPEECH 2016), Vols 1-5: Understanding Speech Processing in Humans and Machines, 2016: 3255-3259
- [36] Task-adaptive Pre-training and Self-training are Complementary for Natural Language Understanding [C]. Findings of the Association for Computational Linguistics, EMNLP 2021, 2021: 1006-1015
- [38] KPGT: Knowledge-Guided Pre-training of Graph Transformer for Molecular Property Prediction [C]. Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, KDD 2022, 2022: 857-867
- [39] Graph-Aware Language Model Pre-Training on a Large Graph Corpus Can Help Multiple Graph Applications [C]. Proceedings of the 29th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, KDD 2023, 2023: 5270-5281
- [40] PreQR: Pre-training Representation for SQL Understanding [C]. Proceedings of the 2022 International Conference on Management of Data (SIGMOD '22), 2022: 204-216