UFDA: Universal Federated Domain Adaptation with Practical Assumptions

Cited by: 0
Authors
Liu, Xinhui [1 ,2 ]
Chen, Zhenghao [2 ]
Zhou, Luping [2 ]
Xu, Dong [3 ]
Xi, Wei [1 ]
Bai, Gairui [1 ]
Zhao, Yihan [1 ]
Zhao, Jizhong [1 ]
Affiliations
[1] Xi'an Jiaotong University, School of Computer Science and Technology, Xi'an, China
[2] University of Sydney, School of Electrical and Computer Engineering, Sydney, NSW, Australia
[3] University of Hong Kong, Department of Computer Science, Hong Kong, China
Funding
National Key R&D Program of China;
Keywords
DOI: not available
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
Conventional Federated Domain Adaptation (FDA) approaches usually demand an abundance of assumptions, which makes them significantly less feasible for real-world situations and introduces security hazards. This paper relaxes the assumptions of previous FDA settings and studies a more practical scenario named Universal Federated Domain Adaptation (UFDA). It requires only the black-box model and the label-set information of each source domain, while the label sets of different source domains may be inconsistent and the target-domain label set is entirely unknown. Toward a more effective solution for our newly proposed UFDA scenario, we propose a corresponding methodology called Hot-Learning with Contrastive Label Disambiguation (HCLD). It tackles UFDA's domain-shift and category-gap problems using only the one-hot outputs from the black-box models of the various source domains. Moreover, to better distinguish the shared and unknown classes, we further present a cluster-level strategy named Mutual-Voting Decision (MVD) to extract robust consensus knowledge across peer classes from both the source and target domains. Extensive experiments on three benchmark datasets demonstrate that, compared to previous methodologies that rely on comprehensive additional assumptions, our method achieves comparable performance under far fewer assumptions.
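The voting idea sketched in the abstract can be illustrated with a minimal, simplified example. This is not the authors' implementation: MVD operates at the cluster level, whereas the sketch below shows a hypothetical sample-level variant in which each black-box source model contributes only a one-hot prediction over its own label set, the predictions are mapped into the union label space, and a target sample keeps the majority label only when enough sources agree, otherwise being flagged as unknown. All function and parameter names are illustrative.

```python
import numpy as np

def one_hot(index, n):
    """Return a length-n one-hot vector with a 1 at `index`."""
    v = np.zeros(n)
    v[index] = 1.0
    return v

def aggregate_votes(target_feats, source_models, source_label_sets,
                    unknown_thresh=0.5):
    """For each target sample, collect a one-hot prediction from every
    black-box source model (only the top-1 class index is observable),
    map it into the union of all source label sets, and keep the
    majority label only when its vote share reaches `unknown_thresh`;
    otherwise the sample is marked 'unknown'."""
    union = sorted(set().union(*source_label_sets))
    idx = {c: i for i, c in enumerate(union)}
    decisions = []
    for x in target_feats:
        votes = np.zeros(len(union))
        for model, labels in zip(source_models, source_label_sets):
            pred = labels[model(x)]  # black box: only a class index comes back
            votes += one_hot(idx[pred], len(union))
        votes /= len(source_models)  # normalize to vote shares
        top = int(votes.argmax())
        decisions.append(union[top] if votes[top] >= unknown_thresh
                         else "unknown")
    return decisions
```

With two toy source models that both predict class index 0 over the label sets `["cat", "dog"]` and `["cat", "bird"]`, the shared class "cat" wins the vote; when every source names a different private class, no label reaches the threshold and the sample falls back to "unknown". The threshold thus controls how aggressively target samples outside the shared label space are rejected.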
Pages: 14026-14034 (9 pages)