Word-level and phrase-level strategies for figurative text identification

Cited: 0
Authors
Yang, Qimeng [1 ]
Yu, Long [2 ]
Tian, Shengwei [3 ]
Song, Jinmiao [1 ]
Affiliations
[1] Xinjiang Univ, Coll Informat Sci & Engn, Urumqi 830000, Peoples R China
[2] Xinjiang Univ, Network Ctr, Urumqi 830000, Peoples R China
[3] Xinjiang Univ, Sch Software, Urumqi 830000, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Metaphor detection; Attention mechanism; Phrase level; Word level; Deep learning; EMPIRICAL MODE DECOMPOSITION;
DOI
10.1007/s11042-022-12233-3
CLC number
TP [Automation technology, computer technology];
Discipline classification code
0812;
Abstract
Metaphors are a common linguistic device that appears widely in natural language, and metaphor detection benefits downstream natural language processing tasks such as machine translation. Current metaphor detection methods capture the semantic incoherence between a metaphorical word and its surrounding text, but they ignore the different contributions of phrases and words, and word information should be more deeply integrated with its contextual information. We propose a learning framework that combines word representations, contextual representations and combined representations. Specifically, we establish word-level and phrase-level attention mechanisms to learn enhanced feature representations. For word-level attention, we extract word embedding, part-of-speech (POS) and distance features as multilevel representations and use multi-attention to obtain weight information. For phrase-level attention, an IndRNN and self-attention are employed to obtain a deep semantic representation of the sentence. With this strategy, our model can mine richer semantic representations. Experiments on the VUA metaphor corpus show that our method achieves an F-score of 69.7% on the test set, surpassing all existing models by a considerable margin.
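The two-branch design described in the abstract (a word-level branch fusing word-embedding, POS and distance features under attention, and a phrase-level branch running an IndRNN followed by self-attention, with both views combined for per-token classification) can be illustrated with the following minimal PyTorch sketch. All dimensions and module names (WordPhraseMetaphorTagger, IndRNNCell, the distance-bucket size, etc.) are assumptions for illustration only; this is not the authors' implementation.

```python
# Illustrative sketch only: hypothetical names and dimensions, not the paper's code.
import torch
import torch.nn as nn


class IndRNNCell(nn.Module):
    """Independently recurrent layer: h_t = relu(W x_t + u * h_{t-1} + b)."""

    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.in_proj = nn.Linear(input_size, hidden_size)
        self.u = nn.Parameter(torch.ones(hidden_size))  # element-wise recurrent weight

    def forward(self, x):                       # x: (batch, seq_len, input_size)
        batch, seq_len, _ = x.shape
        h = x.new_zeros(batch, self.u.numel())
        outputs = []
        for t in range(seq_len):
            h = torch.relu(self.in_proj(x[:, t]) + self.u * h)
            outputs.append(h)
        return torch.stack(outputs, dim=1)      # (batch, seq_len, hidden_size)


class WordPhraseMetaphorTagger(nn.Module):
    def __init__(self, vocab_size=10000, pos_size=50, emb_dim=100,
                 feat_dim=32, hidden=128, heads=4):
        super().__init__()
        # Word-level branch: word embedding + POS + distance features.
        self.word_emb = nn.Embedding(vocab_size, emb_dim)
        self.pos_emb = nn.Embedding(pos_size, feat_dim)
        self.dist_emb = nn.Embedding(512, feat_dim)       # relative-distance buckets
        word_dim = emb_dim + 2 * feat_dim
        self.word_attn = nn.MultiheadAttention(word_dim, heads, batch_first=True)
        # Phrase-level branch: IndRNN followed by self-attention.
        self.indrnn = IndRNNCell(word_dim, hidden)
        self.phrase_attn = nn.MultiheadAttention(hidden, heads, batch_first=True)
        # Combine both views for per-token metaphor/literal classification.
        self.classifier = nn.Linear(word_dim + hidden, 2)

    def forward(self, words, pos_tags, distances):
        x = torch.cat([self.word_emb(words),
                       self.pos_emb(pos_tags),
                       self.dist_emb(distances)], dim=-1)
        word_view, _ = self.word_attn(x, x, x)            # word-level weighting
        phrase_states = self.indrnn(x)
        phrase_view, _ = self.phrase_attn(phrase_states, phrase_states, phrase_states)
        return self.classifier(torch.cat([word_view, phrase_view], dim=-1))


# Toy usage: a batch of 2 sentences, 7 tokens each.
words = torch.randint(0, 10000, (2, 7))
pos_tags = torch.randint(0, 50, (2, 7))
distances = torch.randint(0, 512, (2, 7))
logits = WordPhraseMetaphorTagger()(words, pos_tags, distances)
print(logits.shape)  # torch.Size([2, 7, 2]) -> per-token metaphor/literal logits
```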
Pages: 14339-14353
Page count: 15