Multi-Source Collaborative Contrastive Learning for Decentralized Domain Adaptation

Cited by: 13
Authors
Wei, Yikang [1 ]
Yang, Liu [1 ]
Han, Yahong [1 ]
Hu, Qinghua [1 ]
Affiliations
[1] Tianjin Univ, Coll Intelligence & Comp, Tianjin Key Lab Machine Learning, Tianjin 300350, Peoples R China
Keywords
Adaptation models; Feature extraction; Data models; Collaboration; Data mining; Training; Bridges; Multi-source domain adaptation; data decentralization; contrastive learning; UNSUPERVISED DOMAIN;
DOI
10.1109/TCSVT.2022.3219893
CLC (Chinese Library Classification): TM (Electrical Engineering); TN (Electronics & Communication Technology)
Discipline codes: 0808; 0809
Abstract
Unsupervised multi-source domain adaptation aims to obtain a model that performs well on an unlabeled target domain by reducing the domain gap between multiple labeled source domains and the unlabeled target domain. Owing to data privacy concerns and storage costs, the data of the source domains and the target domain are isolated and decentralized. This decentralization makes domain alignment difficult, since the gap between each decentralized source domain and the target domain must be reduced without direct access to the data of other domains. To perform domain alignment under this scenario, we propose Multi-source Collaborative Contrastive learning for decentralized Domain Adaptation (MCC-DA). Models trained on other domains serve as bridges for reducing the domain gap: on both the source domains and the target domain, we penalize the inconsistency between features extracted by the source-domain models and the target-domain model via contrastive alignment. Through the collaboration of the source-domain models and the target-domain model, the domain gap between the decentralized source domains and the target domain is reduced without exchanging any data across domains. Experiments on multiple benchmarks show that our method reduces the domain gap effectively and significantly outperforms state-of-the-art methods.
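The core idea in the abstract — penalizing inconsistency between features that different domains' models extract for the same input — can be illustrated with an InfoNCE-style contrastive loss. The sketch below is a minimal, hypothetical illustration, not the authors' implementation: `feat_a` and `feat_b` stand for features of the same batch produced by, say, a source-domain model and the target-domain model, and matching rows (same sample, different model) are treated as positive pairs while all other rows act as negatives.

```python
import numpy as np

def contrastive_alignment_loss(feat_a, feat_b, temperature=0.1):
    """InfoNCE-style loss encouraging row i of feat_a to match row i of feat_b.

    feat_a, feat_b: (batch, dim) features of the SAME samples from two models.
    Returns a scalar: low when matching rows are more similar than mismatched ones.
    """
    # L2-normalize so similarities are cosine similarities
    a = feat_a / np.linalg.norm(feat_a, axis=1, keepdims=True)
    b = feat_b / np.linalg.norm(feat_b, axis=1, keepdims=True)
    logits = (a @ b.T) / temperature              # pairwise similarity matrix
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    exp = np.exp(logits)
    # For each sample i, the positive is entry (i, i); the rest of row i are negatives
    probs = np.diag(exp) / exp.sum(axis=1)
    return float(-np.log(probs).mean())
```

Because only model outputs (features) are compared, such a loss can in principle be computed without exchanging raw data between domains — only models or extracted features need to cross domain boundaries, which matches the decentralized setting the abstract describes.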
Pages: 2202-2216 (15 pages)