Knowledge-enhanced Prompt Learning for Open-domain Commonsense Reasoning

Cited by: 0
Authors
Zhao, Xujiang [1 ]
Liu, Yanchi [1 ]
Cheng, Wei [1 ]
Oishi, Mika [2 ]
Osaki, Takao [2 ]
Matsuda, Katsushi [2 ]
Chen, Haifeng [1 ]
Affiliations
[1] NEC Laboratories America, United States
[2] Software and System Engineering Department
Source
NEC Technical Journal | 2024, Vol. 17, No. 02
DOI: not available
Abstract
Neural language models for commonsense reasoning often formulate the problem as a QA task and make predictions based on learned representations of language after fine-tuning. But without any fine-tuning data or pre-defined answer candidates, can neural language models still answer commonsense reasoning questions by relying only on external knowledge? In this work, we investigate a unique yet challenging problem: open-domain commonsense reasoning, which aims to answer questions without providing any answer candidates or fine-tuning examples. A team comprising NECLA (NEC Laboratories America) and the NEC Digital Business Platform Unit proposed a method that leverages neural language models to iteratively retrieve reasoning chains from an external knowledge base, without requiring task-specific supervision. The reasoning chains help identify the most precise answer to the commonsense question, together with the corresponding knowledge statements that justify the answer choice. This technology has proven its effectiveness across a diverse array of business domains. © 2024 NEC Mediaproducts. All rights reserved.
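The abstract's idea of iteratively retrieving reasoning chains from an external knowledge base can be illustrated with a toy sketch. The knowledge base, the scoring function, and the greedy one-hop expansion policy below are all simplified assumptions for illustration; the paper's actual knowledge source and neural scorer are not specified in this record.

```python
# Illustrative sketch only: KB contents, score_step, and the greedy
# expansion policy are assumptions, not the paper's actual method.
from typing import Dict, List, Tuple

# Toy commonsense knowledge base: head entity -> (relation, tail) edges.
# A real system would query an external resource such as ConceptNet.
KB: Dict[str, List[Tuple[str, str]]] = {
    "bird": [("capable_of", "fly"), ("is_a", "animal")],
    "fly": [("requires", "wings")],
    "animal": [("capable_of", "move")],
}

def score_step(question: str, chain: List[str],
               relation: str, tail: str) -> float:
    """Stand-in for a neural LM that rates how relevant a candidate
    expansion is to the question. Here: crude word overlap."""
    words = set(question.lower().replace("?", "").split())
    return float(tail in words) + 0.1 * float(relation.split("_")[0] in words)

def retrieve_chain(question: str, seed: str, max_hops: int = 3) -> List[str]:
    """Grow a reasoning chain from a seed concept one hop at a time,
    keeping the expansion the scorer prefers (a greedy beam of size 1)."""
    chain = [seed]
    for _ in range(max_hops):
        frontier = KB.get(chain[-1], [])
        if not frontier:
            break  # no outgoing edges: the chain terminates here
        relation, tail = max(frontier,
                             key=lambda rt: score_step(question, chain, *rt))
        chain.append(tail)
    return chain

chain = retrieve_chain("Why does a bird need wings?", "bird")
answer = chain[-1]  # the chain's terminal entity serves as the answer
```

The chain itself ("bird" -> "fly" -> "wings") doubles as the knowledge statements justifying the answer, mirroring the abstract's claim that retrieved chains both identify the answer and explain it.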
Pages: 91-95