Where and How to Transfer: Knowledge Aggregation-Induced Transferability Perception for Unsupervised Domain Adaptation

Citations: 99
Authors
Dong, Jiahua [1 ,2 ,3 ]
Cong, Yang [1 ,2 ]
Sun, Gan [1 ,2 ]
Fang, Zhen [4 ]
Ding, Zhengming [5 ]
Affiliations
[1] Chinese Acad Sci, Shenyang Inst Automat, State Key Lab Robot, Shenyang 110016, Liaoning, Peoples R China
[2] Chinese Acad Sci, Inst Robot & Intelligent Mfg, Shenyang 110169, Liaoning, Peoples R China
[3] Univ Chinese Acad Sci, Beijing 100049, Peoples R China
[4] Univ Technol Sydney, Australian Artificial Intelligence Inst, Ultimo, NSW 2007, Australia
[5] Tulane Univ, Dept Comp Sci, New Orleans, LA 70118 USA
Keywords
Transfer learning; unsupervised domain adaptation; semantic segmentation; medical lesions diagnosis; REGULARIZATION; FRAMEWORK;
DOI
10.1109/TPAMI.2021.3128560
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Unsupervised domain adaptation, which avoids the expensive annotation of target data, has achieved remarkable success in semantic segmentation. However, most existing state-of-the-art methods cannot determine whether semantic representations across domains are transferable, which may result in negative transfer caused by irrelevant knowledge. To tackle this challenge, we develop a novel Knowledge Aggregation-induced Transferability Perception (KATP) module for unsupervised domain adaptation, which is a pioneering attempt to distinguish transferable from untransferable knowledge across domains. Specifically, the KATP module quantifies which semantic knowledge is transferable across domains by propagating transferability information from constructed global category-wise prototypes. Based on KATP, we design a novel KATP Adaptation Network (KATPAN) to determine where and how to transfer. KATPAN contains a transferable appearance translation module T_A(·) and a transferable representation augmentation module T_R(·), which together form a virtuous circle of performance promotion. T_A(·) develops a transferability-aware information bottleneck to highlight where to adapt transferable visual characterizations and modality information; T_R(·) explores how to augment transferable representations while abandoning untransferable information, and in return promotes the translation performance of T_A(·). Comprehensive experiments on several representative benchmark datasets and a medical dataset support the state-of-the-art performance of our model.
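The abstract describes building global category-wise prototypes and deriving per-class transferability from them. The sketch below is a minimal, hypothetical illustration of that general idea only, not the paper's actual KATP formulation: it computes class prototypes from labeled source features and pseudo-labeled target features, then scores each class by prototype cosine similarity. All function names and the similarity-based scoring rule are assumptions introduced here for illustration.

```python
# Illustrative sketch (not the paper's implementation): class prototypes and a
# simple per-class transferability weight based on source/target prototype
# agreement. The paper's KATP module uses a richer propagation mechanism.
import torch
import torch.nn.functional as F


def class_prototypes(features, labels, num_classes):
    """Average feature per class; classes absent from the batch get a zero vector.

    features: (N, D) float tensor; labels: (N,) long tensor with entries in [0, C).
    """
    protos = torch.zeros(num_classes, features.size(1), device=features.device)
    counts = torch.zeros(num_classes, device=features.device)
    protos.index_add_(0, labels, features)
    counts.index_add_(0, labels, torch.ones_like(labels, dtype=features.dtype))
    return protos / counts.clamp(min=1).unsqueeze(1)


def transferability_weights(src_feat, src_lab, tgt_feat, tgt_pseudo, num_classes):
    """Per-class weights in [0, 1]: high when source and target prototypes agree."""
    p_src = class_prototypes(src_feat, src_lab, num_classes)
    p_tgt = class_prototypes(tgt_feat, tgt_pseudo, num_classes)
    sim = F.cosine_similarity(p_src, p_tgt, dim=1)   # (C,), values in [-1, 1]
    return sim.clamp(min=0)                          # classes judged untransferable -> ~0


if __name__ == "__main__":
    C, D = 19, 256                                   # toy sizes (e.g. 19 segmentation classes)
    src_feat, tgt_feat = torch.randn(64, D), torch.randn(64, D)
    src_lab = torch.randint(0, C, (64,))
    tgt_pseudo = torch.randint(0, C, (64,))          # pseudo-labels on target data
    w = transferability_weights(src_feat, src_lab, tgt_feat, tgt_pseudo, C)
    print(w.shape, w.min().item(), w.max().item())
```

Such weights could, for instance, down-weight adaptation losses for classes judged untransferable; the transferability-aware bottleneck and representation augmentation described in the abstract are more involved and are not reproduced here.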
Pages: 1664-1681
Number of pages: 18
Related papers
50 records in total
  • [1] Balancing Transferability and Discriminability for Unsupervised Domain Adaptation
    Huang, Jingke
    Xiao, Ni
    Zhang, Lei
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022, 35 (04) : 4911 - 4923
  • [2] Graph Adaptive Knowledge Transfer for Unsupervised Domain Adaptation
    Ding, Zhengming
    Li, Sheng
    Shao, Ming
    Fu, Yun
    COMPUTER VISION - ECCV 2018, PT II, 2018, 11206 : 36 - 52
  • [3] Enhancing transferability and discriminability simultaneously for unsupervised domain adaptation
    Li, Jingyao
    Lue, Shuai
    Zhu, Wenbo
    Li, Zhanshan
    KNOWLEDGE-BASED SYSTEMS, 2022, 247
  • [4] Uncertainty-Induced Transferability Representation for Source-Free Unsupervised Domain Adaptation
    Pei, Jiangbo
    Jiang, Zhuqing
    Men, Aidong
    Chen, Liang
    Liu, Yang
    Chen, Qingchao
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2023, 32 : 2033 - 2048
  • [5] PUnDA: Probabilistic Unsupervised Domain Adaptation for Knowledge Transfer Across Visual Categories
    Gholami, Behnam
    Rudovic, Ognjen
    Pavlovic, Vladimir
    2017 IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV), 2017, : 3601 - 3610
  • [6] Transfer Domain Class Clustering for Unsupervised Domain Adaptation
    Fan, Yunxin
    Yan, Gang
    Li, Shuang
    Song, Shiji
    Wang, Wei
    Peng, Xinping
    PROCEEDINGS OF THE 3RD INTERNATIONAL CONFERENCE ON ELECTRICAL AND INFORMATION TECHNOLOGIES FOR RAIL TRANSPORTATION (EITRT) 2017: ELECTRICAL TRACTION, 2018, 482 : 827 - 835
  • [7] Knowledge distillation for BERT unsupervised domain adaptation
    Ryu, Minho
    Lee, Geonseok
    Lee, Kichun
    KNOWLEDGE AND INFORMATION SYSTEMS, 2022, 64 (11) : 3113 - 3128
  • [8] Geometric Knowledge Embedding for unsupervised domain adaptation
    Wu, Hanrui
    Yan, Yuguang
    Ye, Yuzhong
    Ng, Michael K.
    Wu, Qingyao
    KNOWLEDGE-BASED SYSTEMS, 2020, 191
  • [9] Prior Knowledge Guided Unsupervised Domain Adaptation
    Sun, Tao
    Lu, Cheng
    Ling, Haibin
    COMPUTER VISION - ECCV 2022, PT XXXIII, 2022, 13693 : 639 - 655