- [41] A Domain-adaptive Pre-training Approach for Language Bias Detection in News. 2022 ACM/IEEE Joint Conference on Digital Libraries (JCDL), 2022.
- [42] Pre-training for Spoken Language Understanding with Joint Textual and Phonetic Representation Learning. INTERSPEECH 2021, 2021: 1244-1248.
- [44] Kaleido-BERT: Vision-Language Pre-training on Fashion Domain. 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR 2021), 2021: 12642-12652.
- [45] MarkupLM: Pre-training of Text and Markup Language for Visually Rich Document Understanding. Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (ACL 2022), Vol. 1: Long Papers, 2022: 6078-6087.
- [46] Continual Mixed-Language Pre-Training for Extremely Low-Resource Neural Machine Translation. Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021, 2021: 2706-2718.
- [47] A New Pre-training Method for Training Deep Learning Models with Application to Spoken Language Understanding. 17th Annual Conference of the International Speech Communication Association (INTERSPEECH 2016), 2016: 3255-3259.
- [48] Task-adaptive Pre-training and Self-training are Complementary for Natural Language Understanding. Findings of the Association for Computational Linguistics: EMNLP 2021, 2021: 1006-1015.
- [49] MGeo: Multi-Modal Geographic Language Model Pre-Training. Proceedings of the 46th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR 2023), 2023: 185-194.
- [50] Knowledge distilled pre-training model for vision-language-navigation. Applied Intelligence, 2023, 53: 5607-5619.