Exploring Sensory Knowledge and Pre-training Language Models for Chinese Metaphor Detection

Cited by: 0
Authors
Zhao, Qingqing [1 ]
Xiang, Xue [2 ]
Wang, Zhongqing [2 ]
Institutions
[1] Chinese Acad Social Sci, Inst Linguist, Beijing, Peoples R China
[2] Soochow Univ, Sch Comp Sci & Technol, Suzhou, Peoples R China
Keywords
Metaphor detection; Sensory experiences; Neural network model; Mandarin Chinese;
DOI
10.1109/IALP63756.2024.10661181
CLC number
TP18 [Artificial intelligence theory];
Subject classification numbers
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Processing metaphorical language has been one of the most challenging tasks in Natural Language Processing (NLP). Recent work using neural models has achieved notable results on metaphor detection. However, a defining characteristic of metaphorical language, its intimate relationship with sensory experience, has not received enough attention in NLP. This study proposes a novel model for Chinese metaphor detection that incorporates conceptual knowledge of sensory experiences into a neural network. Experiments show that our model significantly outperforms state-of-the-art baselines, contributing to the ongoing effort to incorporate neuro-cognitive data into NLP tasks. In addition, the effectiveness of our model deepens our understanding of metaphor: sensory experiences form a crucial part of the embodied nature of metaphorical language.
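To make the abstract's core idea concrete, here is a minimal, hypothetical sketch of how conceptual sensory knowledge could feed a metaphor-detection signal: score adjective-noun pairs by how strongly the adjective's dominant sensory modality clashes with the noun's sensory profile. The lexicon entries, feature dimensions, and scoring rule below are illustrative toys and are not the paper's actual resources or model.

```python
# Toy sensory lexicon: word -> strengths over five modalities
# (visual, auditory, tactile, gustatory, olfactory). Values are invented.
SENSORY_LEXICON = {
    "sweet":  (0.1, 0.0, 0.0, 0.9, 0.2),
    "bright": (0.9, 0.1, 0.0, 0.0, 0.0),
    "voice":  (0.0, 0.9, 0.0, 0.0, 0.0),
    "cake":   (0.3, 0.0, 0.2, 0.8, 0.3),
}

def sensory_features(token):
    """Look up a token's sensory profile; unknown words get a zero vector."""
    return SENSORY_LEXICON.get(token, (0.0,) * 5)

def modality_mismatch(adj, noun):
    """Crude synaesthetic-metaphor signal: a strongly sensory adjective
    whose profile barely overlaps the noun's suggests figurative use."""
    a, n = sensory_features(adj), sensory_features(noun)
    overlap = sum(x * y for x, y in zip(a, n))
    return max(a) - overlap  # high when modalities clash

# "sweet voice" (gustatory adjective, auditory noun) scores higher than
# the literal "sweet cake" (gustatory adjective, gustatory noun).
print(modality_mismatch("sweet", "voice"))  # 0.9
print(modality_mismatch("sweet", "cake"))
```

In a full neural model, such sensory vectors would typically be concatenated with contextual token embeddings before classification rather than used as a standalone rule; this sketch only isolates the knowledge-injection idea.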
Pages: 120 - 126 (7 pages)
Related papers (50 total)
  • [21] Does Pre-training Induce Systematic Inference? How Masked Language Models Acquire Commonsense Knowledge
    Porada, Ian
    Sordoni, Alessandro
    Cheung, Jackie Chi Kit
    NAACL 2022: THE 2022 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES, 2022, : 4550 - 4557
  • [22] Exploring Visual Pre-training for Robot Manipulation: Datasets, Models and Methods
    Jing, Ya
    Zhu, Xuelin
    Liu, Xingbin
    Sima, Qie
    Yang, Taozheng
    Feng, Yunhai
    Kong, Tao
    2023 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2023, : 11390 - 11395
  • [23] Transferable Multimodal Attack on Vision-Language Pre-training Models
    Wang, Haodi
    Dong, Kai
    Zhu, Zhilei
    Qin, Haotong
    Liu, Aishan
    Fang, Xiaolin
    Wang, Jiakai
    Liu, Xianglong
    45TH IEEE SYMPOSIUM ON SECURITY AND PRIVACY, SP 2024, 2024, : 1722 - 1740
  • [24] Pre-training and Evaluating Transformer-based Language Models for Icelandic
    Daðason, Jón Friðrik
    Loftsson, Hrafn
    LREC 2022: THIRTEENTH INTERNATIONAL CONFERENCE ON LANGUAGE RESOURCES AND EVALUATION, 2022, : 7386 - 7391
  • [25] Towards Adversarial Attack on Vision-Language Pre-training Models
    Zhang, Jiaming
    Yi, Qi
    Sang, Jitao
    PROCEEDINGS OF THE 30TH ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, MM 2022, 2022, : 5005 - 5013
  • [26] Pre-training Universal Language Representation
    Li, Yian
    Zhao, Hai
    59TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 11TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (ACL-IJCNLP 2021), VOL 1, 2021, : 5122 - 5133
  • [27] Knowledge Boosting: Rethinking Medical Contrastive Vision-Language Pre-training
    Chen, Xiaofei
    He, Yuting
    Xue, Cheng
    Ge, Rongjun
    Li, Shuo
    Yang, Guanyu
    MEDICAL IMAGE COMPUTING AND COMPUTER ASSISTED INTERVENTION, MICCAI 2023, PT I, 2023, 14220 : 405 - 415
  • [28] Knowledge Enhanced Pre-Training Model for Vision-Language-Navigation Task
    Huang, Jitao
    Zeng, Guohui
    Huang, Bo
    Gao, Yongbin
    Liu, Jin
    Shi, Zhicai
    WUHAN UNIVERSITY JOURNAL OF NATURAL SCIENCES, 2021, 26 (02) : 147 - 155
  • [29] Graph Structure Enhanced Pre-Training Language Model for Knowledge Graph Completion
    Zhu, Huashi
    Xu, Dexuan
    Huang, Yu
    Jin, Zhi
    Ding, Weiping
    Tong, Jiahui
    Chong, Guoshuang
    IEEE TRANSACTIONS ON EMERGING TOPICS IN COMPUTATIONAL INTELLIGENCE, 2024, 8 (04) : 2697 - 2708
  • [30] Multi-stage Pre-training over Simplified Multimodal Pre-training Models
    Liu, Tongtong
    Feng, Fangxiang
    Wang, Xiaojie
    59TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 11TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING, VOL 1 (ACL-IJCNLP 2021), 2021, : 2556 - 2565