共 50 条
- [1] Synergistic Anchored Contrastive Pre-training for Few-Shot Relation Extraction [J]. THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 17, 2024, : 18742 - 18750
- [2] Multitask Pre-training of Modular Prompt for Chinese Few-Shot Learning [J]. PROCEEDINGS OF THE 61ST ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2023): LONG PAPERS, VOL 1, 2023, : 11156 - 11172
- [3] A lightweight approach based on prompt for few-shot relation extraction [J]. COMPUTER SPEECH AND LANGUAGE, 2024, 84
- [4] Effectiveness of Pre-training for Few-shot Intent Classification [J]. FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2021, 2021, : 1114 - 1120
- [5] Unified Multi-modal Pre-training for Few-shot Sentiment Analysis with Prompt-based Learning [J]. PROCEEDINGS OF THE 30TH ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, MM 2022, 2022,
- [6] Continual Few-Shot Relation Extraction with Prompt-Based Contrastive Learning [J]. WEB AND BIG DATA, PT IV, APWEB-WAIM 2023, 2024, 14334 : 312 - 327
- [7] Few-Shot Dataset Distillation via Translative Pre-Training [J]. 2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2023), 2023, : 18608 - 18618
- [8] Consistent Prototype Learning for Few-Shot Continual Relation Extraction [J]. PROCEEDINGS OF THE 61ST ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL 2023, VOL 1, 2023, : 7409 - 7422
- [9] Label Semantic Aware Pre-training for Few-shot Text Classification [J]. PROCEEDINGS OF THE 60TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022), VOL 1: (LONG PAPERS), 2022, : 8318 - 8334
- [10] A Prototype Network Enhanced Relation Semantic Representation for Few-shot Relation Extraction [J]. Human-Centric Intelligent Systems, 2023, 3 (1): : 1 - 12