50 entries in total
- [1] Unifying Graph Retrieval and Prompt Tuning for Graph-Grounded Text Classification [J]. PROCEEDINGS OF THE 47TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL, SIGIR 2024, 2024: 2682-2686
- [2] Investigating the Pre-Training Bias in Low-Resource Abstractive Summarization [J]. IEEE ACCESS, 2024, 12: 47219-47230
- [3] Pre-training on High-Resource Speech Recognition Improves Low-Resource Speech-to-Text Translation [J]. 2019 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL HLT 2019), VOL. 1, 2019: 58-68
- [4] Self-Supervised Audio-and-Text Pre-training with Extremely Low-Resource Parallel Data [J]. THIRTY-SIXTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FOURTH CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE / TWELFTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2022: 10875-10883
- [5] Low-Resource Named Entity Recognition via the Pre-Training Model [J]. SYMMETRY-BASEL, 2021, 13 (05)
- [6] Multi-Stage Pre-training for Low-Resource Domain Adaptation [J]. PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020: 5461-5468
- [10] Low-Resource Neural Machine Translation Using XLNet Pre-training Model [J]. ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING, ICANN 2021, PT V, 2021, 12895: 503-514