Self-attentive Biaffine Dependency Parsing

Cited by: 0
Authors
Li, Ying [1]
Li, Zhenghua [1]
Zhang, Min [1]
Wang, Rui [2]
Li, Sheng [1]
Si, Luo [2]
Affiliations
[1] Soochow Univ, Sch Comp Sci & Technol, Inst Artificial Intelligence, Suzhou, Peoples R China
[2] Alibaba Grp, Shenzhen, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
The current state-of-the-art dependency parsing approaches employ BiLSTMs to encode input sentences. Motivated by the success of Transformer-based machine translation, this work is the first to apply the self-attention mechanism to dependency parsing as a replacement for the BiLSTM, achieving competitive performance on both English and Chinese benchmark data. Based on detailed error analysis, we then combine the strengths of BiLSTM and self-attention via model ensembles, demonstrating their complementary ability to capture contextual information. Finally, we explore recently proposed contextualized word representations as extra input features, which further improve parsing performance.
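The biaffine scorer named in the title can be illustrated with a toy NumPy sketch. This is not the paper's exact configuration: the dimensions, random inputs, and greedy head selection below are illustrative assumptions; it only shows the core bilinear-plus-bias form of biaffine arc scoring (Dozat and Manning style), where each word's dependent representation is scored against every word's head representation.

```python
import numpy as np

rng = np.random.default_rng(0)

n, d = 5, 8  # toy sentence length and representation size (assumed values)

# Encoder outputs projected into dependent-role and head-role spaces
# (stand-ins for the MLP-projected BiLSTM/self-attention states).
H_dep = rng.normal(size=(n, d))
H_head = rng.normal(size=(n, d))

# Append a constant 1 to each head vector so the single bilinear product
# also realizes a linear term in the head representation.
H_head_aug = np.concatenate([H_head, np.ones((n, 1))], axis=1)  # (n, d+1)

U = rng.normal(size=(d, d + 1))  # biaffine weight matrix

# scores[i, j] = H_dep[i] @ U @ H_head_aug[j]: score of word j heading word i
scores = H_dep @ U @ H_head_aug.T  # shape (n, n)

# Greedy decoding for illustration: each word picks its best-scoring head
# (a real parser would enforce a well-formed tree, e.g. via MST decoding).
pred_heads = scores.argmax(axis=1)
print(scores.shape, pred_heads.shape)
```

The bias-augmentation trick keeps the scorer a single matrix product per sentence, which is why biaffine scoring is cheap even over all n x n head-dependent pairs.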
Pages: 5067-5073
Page count: 7
Related Papers
50 results
  • [1] Constituency Parsing with a Self-Attentive Encoder
    Kitaev, Nikita
    Klein, Dan
    PROCEEDINGS OF THE 56TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL), VOL 1, 2018: 2676-2686
  • [2] Korean Dependency Parsing Using Deep Biaffine Dependency Parser
    Cui, Danxin
    Bi, Yude
    2024 INTERNATIONAL CONFERENCE ON ASIAN LANGUAGE PROCESSING, IALP 2024, 2024: 451-455
  • [3] Auxiliary Tasks to Boost Biaffine Semantic Dependency Parsing
    Candito, Marie
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022), 2022: 2422-2429
  • [4] Self-Attentive Associative Memory
    Le, Hung
    Tran, Truyen
    Venkatesh, Svetha
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 119, 2020
  • [5] Biaffine Dependency and Semantic Graph Parsing for Enhanced Universal Dependencies
    Attardi, Giuseppe
    Sartiano, Daniele
    Simi, Maria
    IWPT 2021: THE 17TH INTERNATIONAL CONFERENCE ON PARSING TECHNOLOGIES: PROCEEDINGS OF THE CONFERENCE (INCLUDING THE IWPT 2021 SHARED TASK), 2021: 184-188
  • [6] Self-Attentive Sequential Recommendation
    Kang, Wang-Cheng
    McAuley, Julian
    2018 IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM), 2018: 197-206
  • [7] On the Robustness of Self-Attentive Models
    Hsieh, Yu-Lun
    Cheng, Minhao
    Juan, Da-Cheng
    Wei, Wei
    Hsu, Wen-Lian
    Hsieh, Cho-Jui
    57TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2019), 2019: 1520-1529
  • [8] End-to-End Argument Mining as Biaffine Dependency Parsing
    Ye, Yuxiao
    Teufel, Simone
    16TH CONFERENCE OF THE EUROPEAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (EACL 2021), 2021: 669-678
  • [9] Denoising Self-Attentive Sequential Recommendation
    Chen, Huiyuan
    Lin, Yusan
    Pan, Menghai
    Wang, Lan
    Yeh, Chin-Chia Michael
    Li, Xiaoting
    Zheng, Yan
    Wang, Fei
    Yang, Hao
    PROCEEDINGS OF THE 16TH ACM CONFERENCE ON RECOMMENDER SYSTEMS, RECSYS 2022, 2022: 92-101
  • [10] SAED: self-attentive energy disaggregation
    Virtsionis-Gkalinikis, Nikolaos
    Nalmpantis, Christoforos
    Vrakas, Dimitris
    MACHINE LEARNING, 2023, 112(11): 4081-4100