Exploring Sensory Knowledge and Pre-training Language Models for Chinese Metaphor Detection

Times cited: 0
Authors
Zhao, Qingqing [1 ]
Xiang, Xue [2 ]
Wang, Zhongqing [2 ]
Affiliations
[1] Chinese Acad Social Sci, Inst Linguist, Beijing, Peoples R China
[2] Soochow Univ, Sch Comp Sci & Technol, Suzhou, Peoples R China
Keywords
Metaphor detection; Sensory experiences; Neural network model; Mandarin Chinese
DOI
10.1109/IALP63756.2024.10661181
Chinese Library Classification
TP18 [Theory of Artificial Intelligence]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Processing metaphorical language has long been one of the most challenging tasks in Natural Language Processing (NLP). Recent work using neural models has achieved notable results on metaphor detection. However, a defining characteristic of metaphorical language, namely its intimate relationship with sensory experience, has not received enough attention in NLP. This study proposes a novel model for Chinese metaphor detection that incorporates conceptual knowledge of sensory experiences into a neural network. Experiments show that our model significantly outperforms state-of-the-art baselines, contributing to ongoing efforts to incorporate neuro-cognitive data into NLP tasks. In addition, the effectiveness of our model deepens our understanding of metaphor by showing that sensory experiences form a crucial part of the embodied nature of metaphorical language.
Pages: 120-126
Page count: 7