SEQUENTIAL CROSS ATTENTION BASED MULTI-TASK LEARNING

Cited by: 3
Authors
Kim, Sunkyung [1 ]
Choi, Hyesong [1 ]
Min, Dongbo [1 ]
Affiliations
[1] Ewha Womans Univ, Dept Comp Sci & Engn, Seoul, South Korea
Keywords
Multi-task learning; self-attention; cross attention; semantic segmentation; monocular depth estimation;
DOI
10.1109/ICIP46576.2022.9897871
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In multi-task learning (MTL) for visual scene understanding, it is crucial to transfer useful information between multiple tasks with minimal interference. In this paper, we propose a novel architecture that effectively transfers informative features by applying the attention mechanism to the multi-scale features of the tasks. Since applying the attention module directly to all possible feature combinations across scales and tasks incurs high complexity, we propose to apply the attention module sequentially, first across tasks and then across scales. The cross-task attention module (CTAM) is applied first to facilitate the exchange of relevant information between the features of different tasks at the same scale. The cross-scale attention module (CSAM) then aggregates useful information from feature maps at different resolutions within the same task. In addition, we capture long-range dependencies through a self-attention module in the feature extraction network. Extensive experiments demonstrate that our method achieves state-of-the-art performance on the NYUD-v2 and PASCAL-Context datasets. Our code is available at https://github.com/kimsunkyung/SCA-MTL
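This record contains no code. The following is a minimal, hedged sketch of the sequential cross-attention idea described in the abstract (cross-task attention between task features at the same scale, followed by cross-scale attention within a task), written with standard PyTorch multi-head attention. The class names CrossAttentionBlock and SequentialCrossAttention, the two-task/two-scale setup, and all tensor shapes are illustrative assumptions, not the authors' implementation; refer to https://github.com/kimsunkyung/SCA-MTL for the official code.

# Hedged sketch of the sequential cross-attention idea (CTAM-like, then CSAM-like);
# not the authors' implementation, see https://github.com/kimsunkyung/SCA-MTL.
import torch
import torch.nn as nn


class CrossAttentionBlock(nn.Module):
    """One set of query features attends to one set of context features."""

    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, query: torch.Tensor, context: torch.Tensor) -> torch.Tensor:
        # query, context: (batch, tokens, dim); the residual keeps the original features.
        out, _ = self.attn(query, context, context)
        return self.norm(query + out)


class SequentialCrossAttention(nn.Module):
    """Toy two-task, two-scale version: cross-task attention first (CTAM-like),
    then cross-scale attention within each task (CSAM-like)."""

    def __init__(self, dim: int):
        super().__init__()
        self.cross_task = nn.ModuleList([CrossAttentionBlock(dim) for _ in range(2)])
        self.cross_scale = nn.ModuleList([CrossAttentionBlock(dim) for _ in range(2)])

    def forward(self, feats):
        # feats[task][scale]: (batch, tokens, dim) flattened feature maps.
        # Step 1 (CTAM-like): at each scale, each task attends to the other task.
        for s in range(2):
            t0, t1 = feats[0][s], feats[1][s]
            feats[0][s] = self.cross_task[0](t0, t1)
            feats[1][s] = self.cross_task[1](t1, t0)
        # Step 2 (CSAM-like): within each task, the fine scale attends to the coarse one.
        for t in range(2):
            feats[t][0] = self.cross_scale[t](feats[t][0], feats[t][1])
        return feats


if __name__ == "__main__":
    dim = 64
    # Two tasks (e.g. segmentation and depth), two scales each, flattened to tokens.
    feats = [[torch.randn(1, 196, dim), torch.randn(1, 49, dim)] for _ in range(2)]
    out = SequentialCrossAttention(dim)(feats)
    print(out[0][0].shape)  # torch.Size([1, 196, 64])

Applying the two modules in sequence keeps the attention cost roughly linear in the number of task and scale pairs, rather than quadratic in all pairwise feature combinations, which is the complexity argument made in the abstract.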
Pages: 2311-2315
Page count: 5
Related Papers
50 records in total
  • [41] Multimodal Stock Price Forecasting Using Attention Mechanism Based on Multi-Task Learning
    Yang, Haoyan
    WEB AND BIG DATA, PT II, APWEB-WAIM 2023, 2024, 14332 : 454 - 468
  • [43] Fabric Retrieval Based on Multi-Task Learning
    Xiang, Jun
    Zhang, Ning
    Pan, Ruru
    Gao, Weidong
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2021, 30 : 1570 - 1582
  • [44] Multi-task Learning Based Skin Segmentation
    Tan, Taizhe
    Shan, Zhenghao
    KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT, PT III, KSEM 2023, 2023, 14119 : 360 - 369
  • [45] Multi-Task Learning Based Network Embedding
    Wang, Shanfeng
    Wang, Qixiang
    Gong, Maoguo
    FRONTIERS IN NEUROSCIENCE, 2020, 13
  • [46] Graph-based Multi-task Learning
    Li, Ya
    Tian, Xinmei
    2015 IEEE 16TH INTERNATIONAL CONFERENCE ON COMMUNICATION TECHNOLOGY (ICCT), 2015, : 730 - 733
  • [47] AL-Net: Attention Learning Network Based on Multi-Task Learning for Cervical Nucleus Segmentation
    Zhao, Jing
    He, Yong-Jun
    Zhao, Si-Qi
    Huang, Jin-Jie
    Zuo, Wang-Meng
    IEEE JOURNAL OF BIOMEDICAL AND HEALTH INFORMATICS, 2022, 26 (06) : 2693 - 2702
  • [48] A Multi-domain Sentiment Classification model based on Adversarial Multi-task learning and attention mechanisms
    Li, Xinyu
    Jin, Ning
    Yan, Ke
    2022 IEEE INTL CONF ON DEPENDABLE, AUTONOMIC AND SECURE COMPUTING, INTL CONF ON PERVASIVE INTELLIGENCE AND COMPUTING, INTL CONF ON CLOUD AND BIG DATA COMPUTING, INTL CONF ON CYBER SCIENCE AND TECHNOLOGY CONGRESS (DASC/PICOM/CBDCOM/CYBERSCITECH), 2022, : 509 - 516
  • [49] Asymmetric Multi-task Learning Based on Task Relatedness and Loss
    Lee, Giwoong
    Yang, Eunho
    Hwang, Sung Ju
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 48, 2016, 48
  • [50] Enhancing stance detection through sequential weighted multi-task learning
    Alturayeif, Nora
    Luqman, Hamzah
    Ahmed, Moataz
    SOCIAL NETWORK ANALYSIS AND MINING, 2023, 14 (01)