Word-level and phrase-level strategies for figurative text identification

Cited by: 0
Authors
Yang, Qimeng [1 ]
Yu, Long [2 ]
Tian, Shengwei [3 ]
Song, Jinmiao [1 ]
Affiliations
[1] Xinjiang Univ, Coll Informat Sci & Engn, Urumqi 830000, Peoples R China
[2] Xinjiang Univ, Network Ctr, Urumqi 830000, Peoples R China
[3] Xinjiang Univ, Sch Software, Urumqi 830000, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Metaphor detection; Attention mechanism; Phrase level; Word level; Deep learning; Empirical mode decomposition
DOI
10.1007/s11042-022-12233-3
CLC Classification Number
TP [automation technology, computer technology]
Discipline Code
0812
Abstract
Metaphors are a common device in everyday communication and occur widely in natural language. Detecting them benefits downstream natural language processing tasks such as machine translation. Current metaphor detection methods capture the semantic incoherence between a metaphorical word and its surrounding text, but they ignore the different contributions made by phrases and by words, and they do not deeply integrate word information with its contextual information. We propose a learning framework that combines word representations, contextual representations and combined representations. Specifically, we establish word-level and phrase-level attention mechanisms to learn enhanced feature representations. For the word-level attention, we extract word-embedding, part-of-speech (POS) and distance features as multilevel representations and use multi-attention to obtain weight information. For the phrase-level attention, an IndRNN and self-attention are employed to obtain a deep semantic representation of the sentence. With this strategy, our model can mine richer semantic representations. Experiments on the VUA metaphor corpus show that our method achieves an F-score of 69.7% on the test set, surpassing all existing models by a considerable margin.
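The abstract describes the two attention branches only at a high level. The following minimal PyTorch sketch (not the authors' released code) shows one way the pieces could fit together: a word-level branch that attends over concatenated word, POS, and distance embeddings, and a phrase-level branch that runs an IndRNN over the sentence and refines its hidden states with self-attention. All layer sizes, the single-head word attention (the paper uses multi-attention), the mean pooling, and the sentence-level classifier are illustrative assumptions.

```python
# Illustrative sketch of the word-level / phrase-level design from the abstract.
# Dimensions, pooling, and the single-head word attention are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class IndRNNCell(nn.Module):
    """Independently recurrent cell: h_t = relu(W x_t + u * h_{t-1}),
    where u is an element-wise (per-neuron) recurrent weight."""
    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.w = nn.Linear(input_size, hidden_size)
        self.u = nn.Parameter(torch.ones(hidden_size))

    def forward(self, x, h):
        return F.relu(self.w(x) + self.u * h)

class FigurativeDetector(nn.Module):
    def __init__(self, vocab, pos_vocab, max_dist, d_word=300, d_feat=32, d_hid=128):
        super().__init__()
        self.word_emb = nn.Embedding(vocab, d_word)
        self.pos_emb = nn.Embedding(pos_vocab, d_feat)   # part-of-speech feature
        self.dist_emb = nn.Embedding(max_dist, d_feat)   # distance-to-target feature
        d_in = d_word + 2 * d_feat
        # word-level branch: scores each token's multilevel representation
        self.word_attn = nn.Linear(d_in, 1)
        # phrase-level branch: IndRNN over the sentence, then self-attention
        self.indrnn = IndRNNCell(d_in, d_hid)
        self.self_attn = nn.MultiheadAttention(d_hid, num_heads=4, batch_first=True)
        self.classifier = nn.Linear(d_in + d_hid, 2)     # metaphorical vs. literal

    def forward(self, words, pos, dist):
        # multilevel token representation: word + POS + distance embeddings
        x = torch.cat([self.word_emb(words), self.pos_emb(pos), self.dist_emb(dist)], dim=-1)
        # word level: attention-weighted sum of token representations
        a = torch.softmax(self.word_attn(x).squeeze(-1), dim=1)   # (B, T)
        word_repr = torch.bmm(a.unsqueeze(1), x).squeeze(1)       # (B, d_in)
        # phrase level: IndRNN states refined by self-attention
        h = x.new_zeros(x.size(0), self.indrnn.u.numel())
        states = []
        for t in range(x.size(1)):
            h = self.indrnn(x[:, t], h)
            states.append(h)
        s = torch.stack(states, dim=1)                            # (B, T, d_hid)
        s, _ = self.self_attn(s, s, s)
        phrase_repr = s.mean(dim=1)
        # combined representation feeds the classifier
        return self.classifier(torch.cat([word_repr, phrase_repr], dim=-1))
```

Given batch tensors `words`, `pos`, and `dist` of shape `(batch, seq_len)`, the module returns per-sentence logits; a per-token variant of the paper's task would keep the token dimension instead of pooling.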
Pages: 14339-14353
Number of pages: 15
Related papers (50 in total)
  • [1] Word-level and phrase-level strategies for figurative text identification
    Yang, Qimeng
    Yu, Long
    Tian, Shengwei
    Song, Jinmiao
    Multimedia Tools and Applications, 2022, 81: 14339-14353
  • [2] Phrase-level and word-level syllables: Resyllabification and prosodization of clitics
    Cardinaletti, Anna
    Repetti, Lori
    Phonological Domains: Universals and Deviations, 2009, 16: 79-104
  • [3] Assistant diagnosis with Chinese electronic medical records based on CNN and BiLSTM with phrase-level and word-level attentions
    Wang, Tong
    Xuan, Ping
    Liu, Zonglin
    Zhang, Tiangang
    BMC Bioinformatics, 2020, 21 (01)
  • [4] Document and Word-level Language Identification for Noisy User Generated Text
    Kozhirbayev, Zhanibek
    Yessenbayev, Zhandos
    Makazhanov, Aibek
    2018 IEEE 12th International Conference on Application of Information and Communication Technologies (AICT), 2018: 124-127
  • [5] Phrase-Level Combination of SMT and TM Using Constrained Word Lattice
    Li, Liangyou
    Way, Andy
    Liu, Qun
    Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics (ACL 2016), Vol 2, 2016: 275-280
  • [6] Neural mechanisms underlying word- and phrase-level morphological parsing
    Leminen, Alina
    Jakonen, Sini
    Leminen, Miika
    Makela, Jyrki P.
    Lehtonen, Minna
    Journal of Neurolinguistics, 2016, 38: 26-41
  • [7] Phrase-level and edge marking in Drehu
    Torres, Catalina
    Fletcher, Janet
    Glossa: A Journal of General Linguistics, 2022, 7 (01)
  • [8] Few-shot learning for word-level scene text script identification
    Naosekpam, Veronica
    Sahu, Nilkanta
    Computational Intelligence, 2024, 40 (01)
  • [9] Representing Multiword Chemical Terms through Phrase-Level Preprocessing and Word Embedding
    Huang, Liyuan
    Ling, Chen
    ACS Omega, 2019, 4 (20): 18510-18519