Exploring Sensory Knowledge and Pre-training Language Models for Chinese Metaphor Detection

Cited by: 0
Authors
Zhao, Qingqing [1 ]
Xiang, Xue [2 ]
Wang, Zhongqing [2 ]
Affiliations
[1] Chinese Acad Social Sci, Inst Linguist, Beijing, Peoples R China
[2] Soochow Univ, Sch Comp Sci & Technol, Suzhou, Peoples R China
Keywords
Metaphor detection; Sensory experiences; Neural network model; Mandarin Chinese
DOI
10.1109/IALP63756.2024.10661181
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Processing metaphorical language has long been one of the most challenging tasks in Natural Language Processing (NLP). Recent work using neural models has achieved notable results on metaphor detection. However, the intimate relationship between metaphorical language and sensory experience has not received sufficient attention in NLP. This study proposes a novel model for Chinese metaphor detection that incorporates conceptual knowledge of sensory experiences into a neural network model. Experiments show that our model significantly outperforms state-of-the-art baseline models, contributing to ongoing efforts to incorporate neuro-cognitive data into NLP tasks. In addition, the effectiveness of our model deepens our understanding of metaphor, showing that sensory experiences form a crucial part of the embodied nature of metaphorical language.
Pages: 120-126 (7 pages)
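The abstract describes fusing sensory-experience knowledge with a pre-trained language model for Chinese metaphor detection. Below is a minimal sketch of one way such a fusion could be wired, not the authors' implementation: it assumes a hypothetical sensory lexicon with five perceptual-modality ratings, the bert-base-chinese checkpoint from Hugging Face transformers, and simple concatenation of a sentence-level sensory vector with the [CLS] representation; the paper's actual knowledge source, fusion mechanism, and hyperparameters may differ.

```python
# Minimal sketch (not the authors' implementation): fuse a sentence-level
# sensory-knowledge vector with a pre-trained Chinese BERT encoder for
# binary metaphor detection. The sensory lexicon, its five modality
# ratings, and all hyperparameters below are illustrative assumptions.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

# Hypothetical sensory lexicon: word -> ratings over five perceptual
# modalities (visual, auditory, tactile, gustatory, olfactory).
SENSORY_LEXICON = {
    "甜": [0.1, 0.0, 0.1, 0.9, 0.2],    # "sweet"
    "刺耳": [0.0, 0.9, 0.1, 0.0, 0.0],  # "harsh (of a sound)"
}
NUM_MODALITIES = 5

def sensory_features(words):
    """Average the modality ratings of the words found in the lexicon."""
    vecs = [SENSORY_LEXICON[w] for w in words if w in SENSORY_LEXICON]
    if not vecs:
        return torch.zeros(NUM_MODALITIES)
    return torch.tensor(vecs, dtype=torch.float).mean(dim=0)

class SensoryMetaphorDetector(nn.Module):
    def __init__(self, plm_name="bert-base-chinese"):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(plm_name)
        hidden = self.encoder.config.hidden_size
        # Classifier over the concatenation of the [CLS] vector and the
        # sentence-level sensory feature vector.
        self.classifier = nn.Linear(hidden + NUM_MODALITIES, 2)

    def forward(self, input_ids, attention_mask, sensory_vec):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        cls = out.last_hidden_state[:, 0]              # [batch, hidden]
        fused = torch.cat([cls, sensory_vec], dim=-1)  # append sensory knowledge
        return self.classifier(fused)                  # [batch, 2] logits

if __name__ == "__main__":
    tokenizer = AutoTokenizer.from_pretrained("bert-base-chinese")
    model = SensoryMetaphorDetector()
    sentence, words = "她的话很甜", ["她", "的", "话", "很", "甜"]
    enc = tokenizer(sentence, return_tensors="pt")
    feats = sensory_features(words).unsqueeze(0)
    logits = model(enc["input_ids"], enc["attention_mask"], feats)
    print(logits.softmax(dim=-1))  # P(literal), P(metaphorical)
```

Concatenation at the [CLS] level is only the simplest fusion choice; token-level injection of sensory embeddings or attention over them are equally plausible designs that the paper may use instead.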
Related Papers
50 records in total (first 10 listed below)
  • [1] Pre-training Language Models for Comparative Reasoning
    Yu, Mengxia
    Zhang, Zhihan
    Yu, Wenhao
    Jiang, Meng
    2023 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2023), 2023, : 12421 - 12433
  • [2] VILA: On Pre-training for Visual Language Models
    Lin, Ji
    Yin, Hongxu
    Ping, Wei
    Molchanov, Pavlo
    Shoeybi, Mohammad
    Han, Song
    2024 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2024, : 26679 - 26689
  • [3] Contrastive Language-knowledge Graph Pre-training
    Yuan, Xiaowei
    Liu, Kang
    Wang, Yequan
    ACM TRANSACTIONS ON ASIAN AND LOW-RESOURCE LANGUAGE INFORMATION PROCESSING, 2024, 23 (04)
  • [4] WuDaoCorpora: A super large-scale Chinese corpora for pre-training language models
    Yuan, Sha
    Zhao, Hanyu
    Du, Zhengxiao
    Ding, Ming
    Liu, Xiao
    Cen, Yukuo
    Zou, Xu
    Yang, Zhilin
    Tang, Jie
    AI OPEN, 2021, 2 : 65 - 68
  • [5] Improving the Sample Efficiency of Pre-training Language Models
    Berend, Gabor
    ERCIM NEWS, 2024, (136): 38 - 40
  • [6] Pre-training and diagnosing knowledge base completion models
    Kocijan, Vid
    Jang, Myeongjun
    Lukasiewicz, Thomas
    ARTIFICIAL INTELLIGENCE, 2024, 329
  • [7] Pre-Training Language Models for Identifying Patronizing and Condescending Language: An Analysis
    Perez-Almendros, Carla
    Espinosa-Anke, Luis
    Schockaert, Steven
    LREC 2022: THIRTEEN INTERNATIONAL CONFERENCE ON LANGUAGE RESOURCES AND EVALUATION, 2022, : 3902 - 3911
  • [8] JAKET: Joint Pre-training of Knowledge Graph and Language Understanding
    Yu, Donghan
    Zhu, Chenguang
    Yang, Yiming
    Zeng, Michael
    THIRTY-SIXTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FOURTH CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE / TWELFTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2022, : 11630 - 11638
  • [9] PLOME: Pre-training with Misspelled Knowledge for Chinese Spelling Correction
    Liu, Shulin
    Yang, Tao
    Yue, Tianchi
    Zhang, Feng
    Wang, Di
    59TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 11TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (ACL-IJCNLP 2021), VOL 1, 2021, : 2991 - 3000
  • [10] CLOP: Video-and-Language Pre-Training with Knowledge Regularizations
    Li, Guohao
    Yang, Hu
    He, Feng
    Feng, Zhifan
    Lyu, Yajuan
    Wu, Hua
    Wang, Haifeng
    PROCEEDINGS OF THE 30TH ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, MM 2022, 2022, : 4584 - 4593