Multi-Task Learning With Multi-Query Transformer for Dense Prediction

Cited by: 7
Authors
Xu, Yangyang [1]
Li, Xiangtai [2]
Yuan, Haobo [1]
Yang, Yibo [3]
Zhang, Lefei [1,4]
Affiliations
[1] Wuhan Univ, Inst Artificial Intelligence, Sch Comp Sci, Wuhan 430072, Peoples R China
[2] Nanyang Technol Univ, S Lab, Singapore 637335, Singapore
[3] JD Explore Acad, Beijing 101111, Peoples R China
[4] Hubei Luojia Lab, Wuhan 430072, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Scene understanding; multi-task learning; dense prediction; transformers; NETWORK;
DOI
10.1109/TCSVT.2023.3292995
CLC Number
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology];
Discipline Codes
0808; 0809;
Abstract
Previous multi-task dense prediction studies developed complex pipelines, such as multi-modal distillation in multiple stages or searching for a task-relational context for each task. The core insight behind these methods is to maximize the mutual effect of the tasks. Inspired by recent query-based Transformers, we propose a simple pipeline named Multi-Query Transformer (MQTransformer) that is equipped with multiple queries from different tasks to facilitate reasoning among multiple tasks and simplify the cross-task interaction pipeline. Instead of modeling the dense per-pixel context among different tasks, we seek a task-specific proxy to perform cross-task reasoning via multiple queries, where each query encodes the task-related context. The MQTransformer is composed of three key components: a shared encoder, a cross-task query attention module, and a shared decoder. We first model each task with a task-relevant query. Then both the task-specific features produced by the feature extractor and the task-relevant query are fed into the shared encoder, which encodes the task-relevant query from the task-specific features. Second, we design a cross-task query attention module to reason about the dependencies among the task-relevant queries; this lets the module focus only on query-level interaction. Finally, a shared decoder gradually refines the image features with the reasoned query features from the different tasks. Extensive experimental results on two dense prediction datasets (NYUD-v2 and PASCAL-Context) show that the proposed method is effective and achieves state-of-the-art results.
Pages: 1228-1240
Page count: 13
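The abstract's key mechanism is query-level cross-task interaction: each task is represented by a small set of queries, and attention operates over those queries rather than over dense per-pixel features. Below is a minimal PyTorch sketch of that idea; the class name `CrossTaskQueryAttention`, the query shapes, and all hyperparameters are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of cross-task query attention (not the authors' code):
# each task owns a small bank of learnable queries; queries from all tasks
# are concatenated into one sequence and exchange information through
# standard multi-head self-attention, so cross-task interaction stays at
# the query level instead of dense per-pixel attention.
import torch
import torch.nn as nn


class CrossTaskQueryAttention(nn.Module):
    def __init__(self, num_tasks: int, queries_per_task: int,
                 dim: int, num_heads: int = 8):
        super().__init__()
        # One learnable query bank per task; in the paper these would first
        # be refined against task-specific features by the shared encoder.
        self.task_queries = nn.Parameter(
            torch.randn(num_tasks, queries_per_task, dim) * 0.02
        )
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, encoded_queries: torch.Tensor) -> torch.Tensor:
        # encoded_queries: (batch, num_tasks * queries_per_task, dim),
        # i.e. all task-relevant queries flattened into one sequence.
        out, _ = self.attn(encoded_queries, encoded_queries, encoded_queries)
        return self.norm(encoded_queries + out)  # residual + layer norm


# Toy usage: 2 tasks (e.g. depth + segmentation), 4 queries each, dim 256.
module = CrossTaskQueryAttention(num_tasks=2, queries_per_task=4, dim=256)
q = module.task_queries.reshape(1, -1, 256)  # stand-in for encoded queries
refined = module(q)                          # shape: (1, 8, 256)
```

In the full pipeline described by the abstract, a shared decoder would then refine the image features with these reasoned query features to produce each task's dense prediction; that stage is omitted here for brevity.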