Multi-Task Active Learning with Output Constraints

Cited by: 0
Authors
Zhang, Yi [1]
Affiliations
[1] Carnegie Mellon Univ, Machine Learning Dept, Pittsburgh, PA 15213 USA
DOI
Not available
CLC number
TP18 [Artificial intelligence theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Many problems in information extraction, text mining, natural language processing and other fields exhibit the same property: multiple prediction tasks are related in the sense that their outputs (labels) satisfy certain constraints. In this paper, we propose an active learning framework exploiting such relations among tasks. Intuitively, with task outputs coupled by constraints, active learning can utilize not only the uncertainty of the prediction in a single task but also the inconsistency of predictions across tasks. We formalize this idea as a cross-task value-of-information criterion, in which the reward of a labeling assignment is propagated and measured over all relevant tasks reachable through constraints. A specific instance of our framework leads to a cross-entropy measure on the predictions of coupled tasks, which generalizes the entropy used in classical single-task uncertainty sampling. We conduct experiments on two real-world problems: web information extraction and document classification. Empirical results demonstrate the effectiveness of our framework in actively collecting labeled examples for multiple related tasks.
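The abstract's key idea can be sketched in code: score an unlabeled example not only by the entropy of a single task's prediction (classical uncertainty sampling) but also by the cross entropy between the predictions of two constraint-coupled tasks, which is large when the tasks disagree. This is a minimal illustrative sketch, not the paper's exact formulation; the function names and the additive combination of the two terms are assumptions made for the example.

```python
import math

def entropy(p):
    # Shannon entropy of one task's predicted label distribution:
    # the classical single-task uncertainty-sampling score.
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def cross_entropy(p, q):
    # Cross entropy H(p, q) between the predictions of two coupled
    # tasks: grows when the tasks' predictions are inconsistent.
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

def query_score(pred_a, pred_b):
    # Illustrative cross-task score (an assumption, not the paper's
    # criterion): single-task uncertainty plus cross-task inconsistency.
    return entropy(pred_a) + cross_entropy(pred_a, pred_b)

# Two unlabeled examples over a binary label: x1 gets confident,
# consistent predictions from both tasks; x2 gets uncertain and
# conflicting ones, so it is the more valuable labeling query.
x1 = query_score([0.9, 0.1], [0.85, 0.15])
x2 = query_score([0.6, 0.4], [0.2, 0.8])
assert x2 > x1
```

When the coupled tasks agree and cross entropy reduces to plain entropy, the score degenerates to standard uncertainty sampling, matching the abstract's claim that the cross-entropy measure generalizes the single-task case.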
Pages: 667-672
Page count: 6