Chinese Coreference Resolution via Bidirectional LSTMs using Word and Token Level Representations

Cited by: 3
Author
Ming, Kun [1]
Affiliation
[1] Beijing Inst Technol, Sch Comp Sci & Technol, Beijing, Peoples R China
Keywords
Chinese coreference resolution; BERT; bidirectional LSTM
DOI
10.1109/CIS52066.2020.00024
CLC number
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Coreference resolution is an important task in natural language processing. Most existing methods rely on word-level representations alone, discarding much of the information contained in the text. To address this issue, we investigate how span-level semantic representations can improve Chinese coreference resolution. Specifically, we propose a model that obtains word and character representations from pre-trained Skip-Gram embeddings and pre-trained BERT, then explicitly captures span-level information by applying bidirectional LSTMs over these representations. Experiments on the CoNLL-2012 shared task show that the proposed model achieves a 62.95% F1-score, outperforming our baseline methods.
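The pipeline described in the abstract can be sketched numerically: word and character vectors are concatenated per token, a bidirectional LSTM runs over the token sequence, and a candidate-mention span is represented by its endpoint BiLSTM states. The following is a minimal NumPy sketch under assumed dimensions, not the authors' implementation; the toy embedding sizes, random parameters, and the endpoint-concatenation span function are all illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM step; gate order in the stacked weights: input, forget, cell, output."""
    H = h.shape[0]
    z = W @ x + U @ h + b
    i, f = sigmoid(z[0:H]), sigmoid(z[H:2 * H])
    g, o = np.tanh(z[2 * H:3 * H]), sigmoid(z[3 * H:4 * H])
    c_new = f * c + i * g
    return o * np.tanh(c_new), c_new

def bilstm(X, params_f, params_b):
    """Run forward and backward LSTMs over token vectors X (T, D); return (T, 2H)."""
    T = X.shape[0]
    H = params_f[2].shape[0] // 4
    h_f, c_f = np.zeros(H), np.zeros(H)
    h_b, c_b = np.zeros(H), np.zeros(H)
    fwd, bwd = [], [None] * T
    for t in range(T):                      # left-to-right pass
        h_f, c_f = lstm_step(X[t], h_f, c_f, *params_f)
        fwd.append(h_f)
    for t in reversed(range(T)):            # right-to-left pass
        h_b, c_b = lstm_step(X[t], h_b, c_b, *params_b)
        bwd[t] = h_b
    return np.stack([np.concatenate([f, b]) for f, b in zip(fwd, bwd)])

def span_repr(states, start, end):
    """Span representation from BiLSTM states at the endpoints (inclusive span)."""
    return np.concatenate([states[start], states[end]])

rng = np.random.default_rng(0)
D_word, D_char, H = 8, 4, 6            # toy sizes, not the paper's
D = D_word + D_char                    # word vector + character (BERT-style) vector
make = lambda: (rng.normal(size=(4 * H, D)) * 0.1,   # input weights W
                rng.normal(size=(4 * H, H)) * 0.1,   # recurrent weights U
                np.zeros(4 * H))                     # bias b
X = rng.normal(size=(5, D))            # 5 tokens, each a concatenated embedding
states = bilstm(X, make(), make())
span = span_repr(states, 1, 3)         # representation of the span over tokens 1..3
print(states.shape, span.shape)        # (5, 12) (24,)
```

In a real system the forward/backward parameters would be learned, and span representations would be scored pairwise to decide coreference links; this sketch only shows how span-level vectors arise from the token-level BiLSTM states.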
Pages: 73-76
Page count: 4