Multi-task learning with contextual hierarchical attention for Korean coreference resolution

Cited by: 1
Author
Park, Cheoneum [1 ]
Affiliation
[1] AIRS Co., Hyundai Motor Group, Seoul, South Korea
Keywords
coreference resolution; hierarchical model; head-final language; multi-task learning; pointer network
DOI
10.4218/etrij.2021-0293
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronics and Communication Technology]
Subject Classification Codes
0808; 0809
Abstract
Coreference resolution is a discourse-analysis task that links the headwords in a document that refer to the same entity. We propose a pointer network-based coreference resolution model for Korean that uses multi-task learning (MTL) with an attention mechanism over a hierarchical structure. Because Korean is a head-final language, the head of a mention is easy to locate. Given an input headword, our model learns a distribution over the positions of mentions of the same entity and uses a pointer network to perform coreference resolution. Because the input is an entire document, the input sequence is very long; the core idea is therefore to learn word- and sentence-level distributions in parallel with MTL over a shared representation, thereby addressing the long-sequence problem. Word representations for Korean are generated from contextual information using pre-trained Korean language models. Under the same experimental conditions, our model scored roughly 1.8% higher on CoNLL F1 than previous models without a hierarchical structure.
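The abstract describes the architecture only at a high level. Below is a minimal sketch in PyTorch of the general idea as we read it, not the paper's actual implementation: a shared encoder produces one representation, a headword acts as the query, and two pointer-style attention heads score word positions and sentence positions in parallel, with the two losses combined for MTL. The GRU encoder (standing in for the pre-trained Korean language model), the module names, dimensions, and the loss weighting are all assumptions for illustration.

```python
# Minimal sketch of the idea in the abstract, NOT the paper's implementation.
# Assumptions: a GRU stands in for the pre-trained Korean language model;
# HierarchicalMTLPointer, d_model, and alpha are hypothetical names/values.
import torch
import torch.nn as nn

class HierarchicalMTLPointer(nn.Module):
    def __init__(self, vocab_size=32000, d_model=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        # Shared representation: one encoder serves both tasks.
        self.encoder = nn.GRU(d_model, d_model, batch_first=True,
                              bidirectional=True)
        self.proj = nn.Linear(2 * d_model, d_model)
        # Bilinear pointer-attention scorers, one per granularity.
        self.W_word = nn.Linear(d_model, d_model)
        self.W_sent = nn.Linear(d_model, d_model)

    def forward(self, tokens, head_idx, sent_bounds):
        # tokens: (B, T) token ids; head_idx: (B,) position of the headword;
        # sent_bounds: (B, S) index of the last token of each sentence.
        h, _ = self.encoder(self.embed(tokens))   # (B, T, 2*d_model)
        h = self.proj(h)                           # (B, T, d_model)
        query = h[torch.arange(tokens.size(0)), head_idx]  # (B, d_model)
        # Word-level pointer distribution over every token position.
        word_logits = torch.einsum('bd,btd->bt', query, self.W_word(h))
        # Sentence-level pointer distribution over sentence-final states.
        idx = sent_bounds.unsqueeze(-1).expand(-1, -1, h.size(-1))
        s = h.gather(1, idx)                       # (B, S, d_model)
        sent_logits = torch.einsum('bd,bsd->bs', query, self.W_sent(s))
        return word_logits, sent_logits

def mtl_loss(word_logits, sent_logits, word_tgt, sent_tgt, alpha=0.5):
    # The two task losses are optimized jointly over the shared encoder.
    ce = nn.CrossEntropyLoss()
    return alpha * ce(word_logits, word_tgt) + (1 - alpha) * ce(sent_logits, sent_tgt)

# Toy usage: two documents of 50 tokens, three sentences each.
model = HierarchicalMTLPointer()
tokens = torch.randint(0, 32000, (2, 50))
head_idx = torch.tensor([10, 20])
sent_bounds = torch.tensor([[15, 35, 49], [12, 30, 49]])
word_logits, sent_logits = model(tokens, head_idx, sent_bounds)
loss = mtl_loss(word_logits, sent_logits,
                torch.tensor([3, 7]), torch.tensor([0, 1]))
```

The design point the abstract emphasizes is that both heads read the same encoder output, so the long document is encoded once while the sentence-level head operates over a much shorter sequence.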
Pages: 93-104
Page count: 12
Related Papers (50 in total)
  • [1] Constrained Multi-Task Learning for Event Coreference Resolution. Lu, Jing; Ng, Vincent. 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT 2021), 2021: 4504-4514.
  • [2] Multi-Task Learning for Contextual Bandits. Deshmukh, Aniket Anand; Dogan, Urun; Scott, Clayton. Advances in Neural Information Processing Systems 30 (NIPS 2017), 2017, 30.
  • [3] Federated Multi-task Learning with Hierarchical Attention for Sensor Data Analytics. Chen, Yujing; Ning, Yue; Chai, Zheng; Rangwala, Huzefa. 2020 International Joint Conference on Neural Networks (IJCNN), 2020.
  • [4] Hierarchical Prompt Learning for Multi-Task Learning. Liu, Yajing; Lu, Yuning; Liu, Hao; An, Yaozu; Xu, Zhuoran; Yao, Zhuokun; Zhang, Baofeng; Xiong, Zhiwei; Gui, Chenguang. 2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2023: 10888-10898.
  • [5] Hierarchical Inter-Attention Network for Document Classification with Multi-Task Learning. Tian, Bing; Zhang, Yong; Wang, Jin; Xing, Chunxiao. Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence, 2019: 3569-3575.
  • [6] Pedestrian Attribute Recognition via Hierarchical Multi-task Learning and Relationship Attention. Gao, Lian; Huang, Di; Guo, Yuanfang; Wang, Yunhong. Proceedings of the 27th ACM International Conference on Multimedia (MM'19), 2019: 1340-1348.
  • [7] Contextual interference: Single task versus multi-task learning. Maslovat, D.; Chua, R.; Lee, T. D.; Franks, I. M. Motor Control, 2004, 8(2): 213-233.
  • [8] Multi-energy load forecasting via hierarchical multi-task learning and spatiotemporal attention. Song, Cairong; Yang, Haidong; Cai, Jianyang; Yang, Pan; Bao, Hao; Xu, Kangkang; Meng, Xian-Bing. Applied Energy, 2024, 373.
  • [9] HFedMTL: Hierarchical Federated Multi-Task Learning. Yi, Xingfu; Li, Rongpeng; Peng, Chenghui; Wu, Jianjun; Zhao, Zhifeng. 2022 IEEE 33rd Annual International Symposium on Personal, Indoor and Mobile Radio Communications (IEEE PIMRC), 2022.
  • [10] Compressed Hierarchical Representations for Multi-Task Learning and Task Clustering. de Freitas, Joao Machado; Berg, Sebastian; Geiger, Bernhard C.; Muecke, Manfred. 2022 International Joint Conference on Neural Networks (IJCNN), 2022.