Commonsense Knowledge Reasoning and Generation with Pre-trained Language Models: A Survey

Cited: 0
Authors:
Bhargava, Prajjwal [1]
Ng, Vincent [1]
Affiliation:
[1] Univ Texas Dallas, Human Language Technol Res Inst, Richardson, TX 75083 USA
DOI: not available
CLC Classification: TP18 [Artificial Intelligence Theory]
Subject Classification Codes: 081104; 0812; 0835; 1405
Abstract:
While commonsense knowledge acquisition and reasoning have traditionally been core research topics in the knowledge representation and reasoning community, recent years have seen a surge of interest in the natural language processing community in developing pre-trained models and testing their ability to address a variety of newly designed commonsense knowledge reasoning and generation tasks. This paper presents a survey of these tasks, discusses the strengths and weaknesses of state-of-the-art pre-trained models for commonsense reasoning and generation as revealed by these tasks, and reflects on future research directions.
Pages: 12317-12325 (9 pages)
Related Papers (showing items 21-30 of 50):
  • [21] Table and Image Generation for Investigating Knowledge of Entities in Pre-trained Vision and Language Models
    Kamigaito, Hidetaka
    Hayashi, Katsuhiko
    Watanabe, Taro
    61ST ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2023), VOL 2, 2023
  • [22] Probing Simile Knowledge from Pre-trained Language Models
    Chen, Weijie
    Chang, Yongzhu
    Zhang, Rongsheng
    Pu, Jiashu
    Chen, Guandan
    Zhang, Le
    Xi, Yadong
    Chen, Yijiang
    Su, Chang
    PROCEEDINGS OF THE 60TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022), VOL 1 (LONG PAPERS), 2022: 5875-5887
  • [23] Continual knowledge infusion into pre-trained biomedical language models
    Jha, Kishlay
    Zhang, Aidong
    BIOINFORMATICS, 2022, 38 (02): 494-502
  • [24] Pre-trained Language Models in Biomedical Domain: A Systematic Survey
    Wang, Benyou
    Xie, Qianqian
    Pei, Jiahuan
    Chen, Zhihong
    Tiwari, Prayag
    Li, Zhao
    Fu, Jie
    ACM COMPUTING SURVEYS, 2024, 56 (03)
  • [25] Exploring Pre-trained Language Models for Event Extraction and Generation
    Yang, Sen
    Feng, Dawei
    Qiao, Linbo
    Kan, Zhigang
    Li, Dongsheng
    57TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2019), 2019: 5284-5294
  • [26] STYLEDGPT: Stylized Response Generation with Pre-trained Language Models
    Yang, Ze
    Wu, Wei
    Xu, Can
    Liang, Xinnian
    Bai, Jiaqi
    Wang, Liran
    Wang, Wei
    Li, Zhoujun
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: EMNLP 2020, 2020: 1548-1559
  • [27] Commonsense Knowledge Base Completion with Relational Graph Attention Network and Pre-trained Language Model
    Ju, Jinghao
    Yang, Deqing
    Liu, Jingping
    PROCEEDINGS OF THE 31ST ACM INTERNATIONAL CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT (CIKM 2022), 2022: 4104-4108
  • [28] Pre-Trained Language Models and Their Applications
    Wang, Haifeng
    Li, Jiwei
    Wu, Hua
    Hovy, Eduard
    Sun, Yu
    ENGINEERING, 2023, 25: 51-65
  • [29] A Survey of Controllable Text Generation Using Transformer-based Pre-trained Language Models
    Zhang, Hanqing
    Song, Haolin
    Li, Shaoyu
    Zhou, Ming
    Song, Dawei
    ACM COMPUTING SURVEYS, 2024, 56 (03)
  • [30] Pre-trained language models with domain knowledge for biomedical extractive summarization
    Xie, Qianqian
    Bishop, Jennifer Amy
    Tiwari, Prayag
    Ananiadou, Sophia
    KNOWLEDGE-BASED SYSTEMS, 2022, 252