Knowledge-Enhanced Graph Encoding Method for Metaphor Detection in Text

Cited by: 0
Authors
Huang H. [1,2,3]
Liu X. [1,2,3]
Liu Q. [1,2,3]
Affiliations
[1] School of Computer Science and Technology, Beijing Institute of Technology, Beijing
[2] Beijing Engineering Research Center of High-Volume Language Information Processing and Cloud Computing Applications, Beijing
[3] Southeast Academy of Information Technology, Beijing Institute of Technology, Putian
Funding
National Natural Science Foundation of China
Keywords
Fine-grained syntactic knowledge; Graph recurrent neural network; Knowledge-enhanced method; Metaphor detection; Sequence labeling; Word sense knowledge;
DOI
10.7544/issn1000-1239.202110927
Abstract
Metaphor recognition is one of the essential tasks of semantic understanding in natural language processing, aiming to identify whether one concept is viewed in terms of the properties and characteristics of another. Because purely neural methods are limited by dataset scale and the sparsity of human annotations, recent work on metaphor recognition explores how to combine knowledge from other tasks and coarse-grained syntactic knowledge with neural network models to obtain more effective feature vectors for sequence encoding and modeling of text. However, existing methods ignore word sense knowledge and fine-grained syntactic knowledge, which leads to low utilization of external knowledge and difficulty in modeling complex contexts. To address these issues, a knowledge-enhanced graph encoding method (KEG) for metaphor detection in text is proposed. The method consists of three parts. In the encoding layer, a sense vector is trained from word sense knowledge and combined with the context vector produced by a pre-trained model to enhance the semantic representation. In the graph layer, an information graph is constructed from fine-grained syntactic knowledge and used to compute fine-grained context; this layer is combined with a graph recurrent neural network whose state transitions are carried out iteratively to obtain node vectors and a global vector that represent the words and the sentence respectively, enabling efficient modeling of complex contexts. In the decoding layer, conditional random fields decode the sequence tags following a sequence labeling architecture. Experimental results show that the method effectively improves performance on four international public datasets. © 2023, Science Press. All rights reserved.
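The abstract describes the graph layer only at a high level (an information graph built from fine-grained syntactic knowledge, with iterative state transitions of a graph recurrent neural network yielding per-word node vectors and a sentence-level global vector) and does not give the update equations. The sketch below is a minimal, hypothetical PyTorch illustration of that general pattern, not the authors' implementation: the class name GraphRecurrentLayer, the gated updates, the mean-neighbor aggregation, and the dependency-based adjacency matrix are all assumptions made for the example.

```python
# Hypothetical sketch of a graph recurrent layer with iterative node/global
# state transitions over a syntactic information graph. All names and design
# choices here are illustrative assumptions, not details from the paper.
import torch
import torch.nn as nn


class GraphRecurrentLayer(nn.Module):
    def __init__(self, dim: int, steps: int = 3):
        super().__init__()
        self.steps = steps
        # Gated update of each node from (own state, neighbor summary, global state).
        self.node_gate = nn.Linear(3 * dim, dim)
        self.node_cand = nn.Linear(3 * dim, dim)
        # Gated update of the global sentence state from the pooled node states.
        self.glob_gate = nn.Linear(2 * dim, dim)
        self.glob_cand = nn.Linear(2 * dim, dim)

    def forward(self, h: torch.Tensor, adj: torch.Tensor):
        # h:   (batch, seq_len, dim)      initial word vectors (e.g. contextual + sense)
        # adj: (batch, seq_len, seq_len)  0/1 adjacency from fine-grained syntactic edges
        g = h.mean(dim=1)                                # initial global (sentence) state
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1.0)
        for _ in range(self.steps):                      # iterative state transitions
            nbr = (adj @ h) / deg                        # mean of syntactic-neighbor states
            g_exp = g.unsqueeze(1).expand_as(h)
            z = torch.cat([h, nbr, g_exp], dim=-1)
            gate = torch.sigmoid(self.node_gate(z))
            h = gate * torch.tanh(self.node_cand(z)) + (1.0 - gate) * h
            zg = torch.cat([g, h.mean(dim=1)], dim=-1)
            gg = torch.sigmoid(self.glob_gate(zg))
            g = gg * torch.tanh(self.glob_cand(zg)) + (1.0 - gg) * g
        return h, g                                      # node vectors, global vector


if __name__ == "__main__":
    layer = GraphRecurrentLayer(dim=64, steps=3)
    words = torch.randn(2, 10, 64)                       # stand-in for encoder output
    graph = torch.eye(10).repeat(2, 1, 1)                # stand-in dependency adjacency
    node_vecs, sent_vec = layer(words, graph)
    print(node_vecs.shape, sent_vec.shape)               # (2, 10, 64) (2, 64)
```

In a full KEG-style pipeline, the initial node states would presumably be the concatenation of the pre-trained contextual vectors and the trained sense vectors, and the returned node vectors would feed a CRF decoder that assigns a metaphor or literal tag to each token.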
Pages: 140-152
Page count: 12