- [1] Localized Symbolic Knowledge Distillation for Visual Commonsense Models [C]. Advances in Neural Information Processing Systems 36 (NeurIPS 2023), 2023.
- [2] ISD-QA: Iterative Distillation of Commonsense Knowledge from General Language Models for Unsupervised Question Answering [C]. 2022 26th International Conference on Pattern Recognition (ICPR), 2022: 1229-1235.
- [3] DSG-KD: Knowledge Distillation From Domain-Specific to General Language Models [J]. IEEE Access, 2024, 12: 130973-130982.
- [4] Knowledge is Power: Symbolic Knowledge Distillation, Commonsense Morality, & Multimodal Script Knowledge [C]. WSDM'22: Proceedings of the Fifteenth ACM International Conference on Web Search and Data Mining, 2022: 3.
- [5] Commonsense Knowledge Mining from Pretrained Models [C]. Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP 2019), 2019: 1173-1178.
- [7] Preserving Commonsense Knowledge from Pre-trained Language Models via Causal Inference [C]. Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (ACL 2023): Long Papers, Vol 1, 2023: 9155-9173.
- [8] Large Language Models as Commonsense Knowledge for Large-Scale Task Planning [C]. Advances in Neural Information Processing Systems 36 (NeurIPS 2023), 2023.
- [9] Utilizing Language Models to Expand Vision-Based Commonsense Knowledge Graphs [J]. Symmetry-Basel, 2022, 14(08).
- [10] Incorporating Domain Knowledge and Semantic Information into Language Models for Commonsense Question Answering [C]. Proceedings of the 2021 IEEE 24th International Conference on Computer Supported Cooperative Work in Design (CSCWD), 2021: 1160-1165.