Exploring Sensory Knowledge and Pre-training Language Models for Chinese Metaphor Detection

Cited by: 0
|
Authors
Zhao, Qingqing [1 ]
Xiang, Xue [2 ]
Wang, Zhongqing [2 ]
Affiliations
[1] Chinese Acad Social Sci, Inst Linguist, Beijing, Peoples R China
[2] Soochow Univ, Sch Comp Sci & Technol, Suzhou, Peoples R China
Keywords
Metaphor detection; Sensory experiences; Neural network model; Mandarin Chinese
DOI
10.1109/IALP63756.2024.10661181
Chinese Library Classification
TP18 [Theory of Artificial Intelligence]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Processing metaphorical language has been one of the most challenging tasks in Natural Language Processing (NLP). Recent work using neural models has achieved notable results on metaphor detection. However, the intimate relationship between metaphorical language and sensory experiences has not received enough attention in NLP. This study proposes an innovative model for Chinese metaphor detection that incorporates conceptual knowledge of sensory experiences into a neural network model. Experiments show that our model significantly outperforms state-of-the-art baseline models, contributing to the ongoing effort to incorporate neuro-cognitive data into NLP tasks. In addition, the effectiveness of our model deepens our understanding of metaphor by showing that sensory experiences form a crucial part of the embodied nature of metaphorical language.
Pages: 120-126
Page count: 7
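To make the abstract's description more concrete, the sketch below shows one plausible way to combine a pre-trained Chinese language model with sensory knowledge for metaphor detection. It is an illustrative sketch rather than the authors' implementation: the class name SensoryAugmentedMetaphorDetector, the five-dimensional sensory vector, the toy norm lexicon, and fusion by concatenating sensory ratings with the [CLS] representation are all assumptions; the paper's actual feature set, fusion mechanism, and training setup may differ.

import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizerFast

SENSORY_DIMS = 5  # hypothetical dimensions: visual, auditory, tactile, gustatory, olfactory

class SensoryAugmentedMetaphorDetector(nn.Module):
    def __init__(self, plm_name="bert-base-chinese"):
        super().__init__()
        self.encoder = BertModel.from_pretrained(plm_name)
        hidden = self.encoder.config.hidden_size
        # Binary classification head over the fused [CLS] + sensory representation
        self.classifier = nn.Linear(hidden + SENSORY_DIMS, 2)

    def forward(self, input_ids, attention_mask, sensory_feats):
        # sensory_feats: (batch, SENSORY_DIMS) sentence-level sensory ratings
        outputs = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        cls_vec = outputs.last_hidden_state[:, 0]  # [CLS] token representation
        fused = torch.cat([cls_vec, sensory_feats], dim=-1)
        return self.classifier(fused)  # logits for literal vs. metaphorical

def lookup_sensory(chars, norms):
    # Average per-character ratings from a (hypothetical) sensory norm lexicon;
    # characters missing from the lexicon are skipped.
    vecs = [norms[c] for c in chars if c in norms]
    if not vecs:
        return torch.zeros(SENSORY_DIMS)
    return torch.stack(vecs).mean(dim=0)

if __name__ == "__main__":
    tokenizer = BertTokenizerFast.from_pretrained("bert-base-chinese")
    model = SensoryAugmentedMetaphorDetector()
    norms = {"甜": torch.tensor([0.1, 0.0, 0.2, 0.9, 0.1])}  # toy sensory ratings
    sentence = "她的声音很甜。"  # "Her voice is very sweet." (a synaesthetic metaphor)
    encoded = tokenizer(sentence, return_tensors="pt")
    feats = lookup_sensory(list(sentence), norms).unsqueeze(0)
    logits = model(encoded["input_ids"], encoded["attention_mask"], feats)
    print(logits.softmax(dim=-1))  # untrained head, so probabilities are uninformative

Concatenating a low-dimensional sensory vector with the encoder output is the simplest late-fusion option; token-level or attention-based fusion would also be compatible with the same pre-trained backbone under these assumptions.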
Related Papers
50 records in total
  • [31] Knowledge-aware multimodal pre-training for fake news detection. Zhang, Litian; Zhang, Xiaoming; Zhou, Ziyi; Zhang, Xi; Yu, Philip S.; Li, Chaozhuo. INFORMATION FUSION, 2025, 114
  • [32] Realistic Channel Models Pre-training. Huangfu, Yourui; Wang, Jian; Xu, Chen; Li, Rong; Ge, Yiqun; Wang, Xianbin; Zhang, Huazi; Wang, Jun. 2019 IEEE GLOBECOM WORKSHOPS (GC WKSHPS), 2019
  • [33] Multi-Grained Topological Pre-Training of Language Models in Sponsored Search. Tian, Zhoujin; Li, Chaozhuo; Zuo, Zhiqiang; Wen, Zengxuan; Hu, Xinyue; Han, Xiao; Huang, Haizhen; Wang, Senzhang; Deng, Weiwei; Xie, Xing; Zhang, Qi. PROCEEDINGS OF THE 46TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL, SIGIR 2023, 2023: 2189-2193
  • [34] Evaluation of pre-training large language models on leadership-class supercomputers. Yin, Junqi; Dash, Sajal; Gounley, John; Wang, Feiyi; Tourassi, Georgia. JOURNAL OF SUPERCOMPUTING, 2023, 79 (18): 20747-20768
  • [35] JiuZhou: open foundation language models and effective pre-training framework for geoscience. Chen, Zhou; Lin, Ming; Zang, Mingrun; Wang, Zimeng; Li, Juanzi; Bai, Yuqi. INTERNATIONAL JOURNAL OF DIGITAL EARTH, 2025, 18 (01)
  • [36] Length-Based Curriculum Learning for Efficient Pre-training of Language Models. Nagatsuka, Koichi; Broni-Bediako, Clifford; Atsumi, Masayasu. NEW GENERATION COMPUTING, 2023, 41 (01): 109-134
  • [38] Task-adaptive Pre-training of Language Models with Word Embedding Regularization. Nishida, Kosuke; Nishida, Kyosuke; Yoshida, Sen. FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL-IJCNLP 2021, 2021: 4546-4553
  • [39] INGENIOUS: Using Informative Data Subsets for Efficient Pre-Training of Language Models. Renduchintala, H. S. V. N. S. Kowndinya; Killamsetty, Krishnateja; Bhatia, Sumit; Aggarwal, Milan; Ramakrishnan, Ganesh; Iyer, Rishabh; Krishnamurthy, Balaji. FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS - EMNLP 2023, 2023: 6690-6705
  • [40] Code Smell Detection Research Based on Pre-training and Stacking Models. Zhang, Dongwen; Song, Shuai; Zhang, Yang; Liu, Haiyang; Shen, Gaojie. IEEE LATIN AMERICA TRANSACTIONS, 2024, 22 (01): 22-30