Word-level and phrase-level strategies for figurative text identification

Cited: 0
Authors
Yang, Qimeng [1 ]
Yu, Long [2 ]
Tian, Shengwei [3 ]
Song, Jinmiao [1 ]
Affiliations
[1] Xinjiang Univ, Coll Informat Sci & Engn, Urumqi 830000, CO, Peoples R China
[2] Xinjiang Univ, Network Ctr, Urumqi 830000, CO, Peoples R China
[3] Xinjiang Univ, Sch Software, Urumqi 830000, CO, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Metaphor detection; Attention mechanism; Phrase level; Word level; Deep learning; EMPIRICAL MODE DECOMPOSITION;
DOI
10.1007/s11042-022-12233-3
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Metaphors are a common linguistic device that is widespread in natural language. Metaphor detection benefits downstream natural language processing tasks such as machine translation. Current metaphor detection methods capture the semantic incoherence between a metaphorical word and its surrounding text; however, they ignore the differing contributions of phrases and words, and word information should be deeply integrated with its contextual information. We propose a learning framework that combines word representations, contextual representations and combined representations. Specifically, we establish word-level and phrase-level attention mechanisms to learn enhanced feature representations. For word-level attention, we extract word embedding, part-of-speech (POS) and distance features as multilevel representations and use multi-attention to obtain weight information. For phrase-level attention, an IndRNN and self-attention are employed to obtain a deep semantic representation of the sentence. With this strategy, our model can mine richer semantic representations. Experiments on the VUA metaphor corpus show that our method achieves an F-score of 69.7% on the test dataset, surpassing all existing models by a considerable margin.
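The phrase-level branch described in the abstract (an IndRNN over the token sequence, followed by self-attention that pools the hidden states into a sentence representation) can be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' implementation: all weights, dimensions, the ReLU activation, and the single scoring vector `v` are assumptions made for the example.

```python
import math

def indrnn_step(x, h_prev, W, u, b):
    """One IndRNN step: unlike a vanilla RNN, each hidden unit i keeps an
    independent scalar recurrent weight u[i], i.e. h_t = relu(W x_t + u * h_{t-1} + b)."""
    return [max(0.0, sum(W[i][j] * x[j] for j in range(len(x))) + u[i] * h_prev[i] + b[i])
            for i in range(len(u))]

def self_attention(hidden_states, v):
    """Score each hidden state with vector v, softmax the scores into weights,
    and return the attention-weighted sentence vector plus the weights."""
    scores = [sum(vi * hi for vi, hi in zip(v, h)) for h in hidden_states]
    m = max(scores)                              # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    z = sum(exps)
    alphas = [e / z for e in exps]               # attention weights, sum to 1
    dim = len(hidden_states[0])
    sent = [sum(a * h[d] for a, h in zip(alphas, hidden_states)) for d in range(dim)]
    return sent, alphas
```

Running the recurrence over the token embeddings of a sentence and pooling with `self_attention` yields one fixed-size vector per sentence, which the paper's combined representation can then consume alongside the word-level features.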
Pages: 14339-14353
Number of pages: 15
Related Papers
50 records
  • [41] Relating foveal and parafoveal processing efficiency with word-level parameters in text reading
    Heikkila, Timo T.
    Soralinna, Nea
    Hyona, Jukka
    JOURNAL OF MEMORY AND LANGUAGE, 2024, 137
  • [42] Word-level emotion distribution with two schemas for short text emotion classification
    Li, Zongxi
    Xie, Haoran
    Cheng, Gary
    Li, Qing
    KNOWLEDGE-BASED SYSTEMS, 2021, 227
  • [43] Pipelined Neural Networks for Phrase-Level Sentiment Intensity Prediction
    Yu, Liang-Chih
    Wang, Jin
    Lai, K. Robert
    Zhang, Xuejie
    IEEE TRANSACTIONS ON AFFECTIVE COMPUTING, 2020, 11 (03) : 447 - 458
  • [44] Is Word-Level Recursion Actually Recursion?
    Miller, Taylor L.
    Sande, Hannah
    LANGUAGES, 2021, 6 (02)
  • [45] Lifting propositional interpolants to the word-level
    Kroening, Daniel
    Weissenbacher, Georg
    FMCAD 2007: FORMAL METHODS IN COMPUTER AIDED DESIGN, PROCEEDINGS, 2007, : 85 - 89
  • [46] Sentiment Analysis of Urdu Language: Handling Phrase-Level Negation
    Syed, Afraz Zahra
    Aslam, Muhammad
    Maria Martinez-Enriquez, Ana
    ADVANCES IN ARTIFICIAL INTELLIGENCE, PT I, 2011, 7094 : 382 - +
  • [47] Neural Machine Translation with Phrase-Level Universal Visual Representations
    Fang, Qingkai
    Feng, Yang
    PROCEEDINGS OF THE 60TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022), VOL 1: (LONG PAPERS), 2022, : 5687 - 5698
  • [48] The Phonetics of Paiwan Word-Level Prosody
    Chen, Chun-Mei
    LANGUAGE AND LINGUISTICS, 2009, 10 (03) : 593 - 625
  • [49] WARP: Word-level Adversarial ReProgramming
    Hambardzumyan, Karen
    Khachatrian, Hrant
    May, Jonathan
    59TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 11TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (ACL-IJCNLP 2021), VOL 1, 2021, : 4921 - 4933
  • [50] A word-level graph manipulation package
    Höreth S.
    International Journal on Software Tools for Technology Transfer, 2001, 3 (02) : 182 - 192