Estimating the influence of auxiliary tasks for multi-task learning of sequence tagging tasks

Cited: 0
Authors
Schroeder, Fynn [1]
Biemann, Chris [1]
Affiliations
[1] Univ Hamburg, Language Technol Grp, Hamburg, Germany
Keywords
DOI
Not available
CLC number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Multi-task learning (MTL) and transfer learning (TL) are techniques to overcome the issue of data scarcity when training state-of-the-art neural networks. However, finding beneficial auxiliary datasets for MTL or TL is a time- and resource-consuming trial-and-error process. We propose new methods to automatically assess the similarity of sequence tagging datasets to identify beneficial auxiliary data for MTL or TL setups. Our methods can compute the similarity between any two sequence tagging datasets, i.e., the datasets need neither share a tagset nor be annotated with multiple labels in parallel. Additionally, our methods take both tokens and their labels into account, which is more robust than relying on only one of them as an information source, as done in prior work. We empirically show that our similarity measures correlate with the change in test score of neural networks that use the auxiliary dataset for MTL to increase the main-task performance. We provide an efficient, open-source implementation.
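The paper's own similarity measures are not reproduced in this record. As a rough, hedged illustration of the general idea described in the abstract (combining token and label information into one dataset-level similarity score), the following Python sketch computes a Jensen-Shannon-based similarity between two sequence tagging datasets given as lists of (token, label) pairs. The function names, the unigram distributions, and the equal weighting of the two terms are assumptions for illustration only, not the authors' method.

# Illustrative sketch only, not the method from the paper: a token- and
# label-aware similarity score between two sequence tagging datasets.
from collections import Counter
import math


def distribution(counter):
    """Normalise a Counter into a probability distribution over its keys."""
    total = sum(counter.values())
    return {key: count / total for key, count in counter.items()}


def js_divergence(p, q):
    """Jensen-Shannon divergence (base 2, in [0, 1]) between two discrete distributions."""
    keys = set(p) | set(q)
    m = {k: 0.5 * (p.get(k, 0.0) + q.get(k, 0.0)) for k in keys}

    def kl(a, b):
        # Kullback-Leibler divergence restricted to the support of a.
        return sum(prob * math.log2(prob / b[k]) for k, prob in a.items() if prob > 0)

    return 0.5 * kl(p, m) + 0.5 * kl(q, m)


def dataset_similarity(dataset_a, dataset_b, label_weight=0.5):
    """Similarity in [0, 1] between two datasets given as lists of (token, label) pairs.

    Combines a token-based and a label-based term, so two datasets with disjoint
    tagsets still receive a (token-driven) score instead of failing outright.
    """
    tok_a = distribution(Counter(tok.lower() for tok, _ in dataset_a))
    tok_b = distribution(Counter(tok.lower() for tok, _ in dataset_b))
    lab_a = distribution(Counter(lab for _, lab in dataset_a))
    lab_b = distribution(Counter(lab for _, lab in dataset_b))

    token_sim = 1.0 - js_divergence(tok_a, tok_b)
    label_sim = 1.0 - js_divergence(lab_a, lab_b)
    return (1.0 - label_weight) * token_sim + label_weight * label_sim


if __name__ == "__main__":
    # Toy NER and POS fragments over the same sentence: high token similarity,
    # zero label similarity because the tagsets are disjoint.
    ner = [("Paris", "B-LOC"), ("is", "O"), ("beautiful", "O")]
    pos = [("Paris", "NNP"), ("is", "VBZ"), ("beautiful", "JJ")]
    print(f"similarity: {dataset_similarity(ner, pos):.3f}")

In practice, a score of this kind would be used to rank candidate auxiliary datasets before any training is run; the abstract reports that the authors' measures correlate with the change in main-task test score when the auxiliary dataset is used for MTL.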
Pages: 2971-2985
Number of pages: 15
Related papers
50 in total
  • [1] Multi-task learning with Attention : Constructing auxiliary tasks for learning to learn
    Li, Benying
    Dong, Aimei
    [J]. 2021 IEEE 33RD INTERNATIONAL CONFERENCE ON TOOLS WITH ARTIFICIAL INTELLIGENCE (ICTAI 2021), 2021, : 145 - 152
  • [2] Sample-level weighting for multi-task learning with auxiliary tasks
    Gregoire, Emilie
    Chaudhary, Muhammad Hafeez
    Verboven, Sam
    [J]. APPLIED INTELLIGENCE, 2024, 54 (04) : 3482 - 3501
  • [3] Multi-task Learning with Labeled and Unlabeled Tasks
    Pentina, Anastasia
    Lampert, Christoph H.
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 70, 2017, 70
  • [4] MetaWeighting: Learning to Weight Tasks in Multi-Task Learning
    Mao, Yuren
    Wang, Zekai
    Liu, Weiwei
    Lin, Xuemin
    Xie, Pengtao
    [J]. FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022), 2022, : 3436 - 3448
  • [5] Task Aware Multi-Task Learning for Speech to Text Tasks
    Indurthi, Sathish
    Zaidi, Mohd Abbas
    Lakumarapu, Nikhil Kumar
    Lee, Beomseok
    Han, Hyojung
    Ahn, Seokchan
    Kim, Sangha
    Kim, Chanwoo
    Hwang, Inchul
    [J]. 2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 7723 - 7727
  • [6] Multi-Task Learning for Dense Prediction Tasks: A Survey
    Vandenhende, Simon
    Georgoulis, Stamatios
    Van Gansbeke, Wouter
    Proesmans, Marc
    Dai, Dengxin
    Van Gool, Luc
    [J]. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2022, 44 (07) : 3614 - 3633
  • [7] Flexible Clustered Multi-Task Learning by Learning Representative Tasks
    Zhou, Qiang
    Zhao, Qi
    [J]. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2016, 38 (02) : 266 - 278
  • [8] Multi-Task Learning for Voice Related Recognition Tasks
    Montalvo, Ana
    Calvo, Jose R.
    Bonastre, Jean-Francois
    [J]. INTERSPEECH 2020, 2020, : 2997 - 3001
  • [9] Unveiling Groups of Related Tasks in Multi-Task Learning
    Frecon, Jordan
    Salzo, Saverio
    Pontil, Massimiliano
    [J]. 2020 25TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2021, : 7134 - 7141