Mention detection in Turkish coreference resolution

Cited: 0
|
Authors
Demir, Seniz [1 ]
Akdag, Hanifi Ibrahim [1 ]
Affiliations
[1] MEF Univ, Dept Comp Engn, Istanbul, Turkiye
Keywords
Coreference resolution; mention detection; neural network; language model; Turkish;
DOI
10.55730/1300-0632.4095
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
A crucial step in understanding natural language is detecting mentions that refer to real-world entities in a text and correctly identifying their boundaries. Mention detection is commonly treated as a preprocessing step in coreference resolution, which has been shown to benefit several language processing applications such as machine translation and text summarization. Despite recent efforts on Turkish coreference resolution, no standalone neural solution to mention detection has yet been proposed. In this article, we present two models for detecting Turkish mentions using feed-forward neural networks. Both models extract all spans up to a fixed length from the input text as candidates and classify them as mentions or non-mentions. The models differ in how candidate text spans are represented. The first model represents a span by focusing on its first and last words, whereas in the second model the representation also covers the words preceding and following a span. Mention span representations are formed from contextual embeddings, part-of-speech embeddings, and named-entity embeddings of the words of interest, where the contextual embeddings are obtained from pretrained Turkish language models. In our evaluation studies, we not only assess the impact of mention representation strategies on system performance but also demonstrate the usability of different pretrained language models in the resolution task. We argue that our work provides useful insights into the existing literature and constitutes a first step toward understanding the effectiveness of neural architectures in Turkish mention detection.
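The abstract outlines a candidate-based pipeline: every contiguous span up to a fixed length is extracted, represented via the embeddings of its first and last words, and scored by a feed-forward classifier. The following is a minimal Python sketch of that idea; the length cap, the toy per-word vectors, and the one-layer sigmoid scorer are illustrative assumptions, not details taken from the paper.

```python
import math

MAX_SPAN_LEN = 4  # illustrative span-length cap; the paper's actual value is not stated here


def enumerate_spans(n_tokens, max_len=MAX_SPAN_LEN):
    """All (start, end) token spans, end inclusive, of length <= max_len."""
    return [(i, j)
            for i in range(n_tokens)
            for j in range(i, min(i + max_len, n_tokens))]


def span_representation(word_vecs, start, end):
    """First-model-style representation: the span's first and last word
    vectors, concatenated. In the paper each word vector would itself
    combine contextual, POS, and named-entity embeddings."""
    return word_vecs[start] + word_vecs[end]  # list concatenation


def score_spans(word_vecs, weights, bias):
    """Score every candidate span with a toy one-layer feed-forward scorer
    (sigmoid output); a span would be predicted as a mention if score > 0.5."""
    scores = {}
    for (i, j) in enumerate_spans(len(word_vecs)):
        x = span_representation(word_vecs, i, j)
        z = sum(w * v for w, v in zip(weights, x)) + bias
        scores[(i, j)] = 1.0 / (1.0 + math.exp(-z))
    return scores
```

With zero weights the scorer outputs 0.5 for every span, so a trained weight vector (learned over gold mention labels) is what turns this enumeration into an actual detector; the second model in the abstract would extend `span_representation` with the vectors of the words before and after the span.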
Pages: 682-697 (17 pages)