Divergence Optimization for Noisy Universal Domain Adaptation

Cited by: 22
Authors
Yu, Qing [1 ,2 ]
Hashimoto, Atsushi [2 ]
Ushiku, Yoshitaka [2 ]
Affiliations
[1] Univ Tokyo, Tokyo, Japan
[2] OMRON SINIC X Corp, Tokyo, Japan
Keywords
DOI
10.1109/CVPR46437.2021.00254
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Universal domain adaptation (UniDA) has been proposed to transfer knowledge learned from a label-rich source domain to a label-scarce target domain without any constraints on the label sets. In practice, however, it is difficult to obtain a large amount of perfectly clean labeled data in a source domain with limited resources. Existing UniDA methods rely on source samples with correct annotations, which greatly limits their application in the real world. Hence, we consider a new realistic setting called Noisy UniDA, in which classifiers are trained with noisy labeled data from the source domain and unlabeled data with an unknown class distribution from the target domain. This paper introduces a two-head convolutional neural network framework to solve all problems simultaneously. Our network consists of one common feature generator and two classifiers with different decision boundaries. By optimizing the divergence between the two classifiers' outputs, we can detect noisy source samples, find "unknown" classes in the target domain, and align the distribution of the source and target domains. In an extensive evaluation of different domain adaptation settings, the proposed method outperformed existing methods by a large margin in most settings.
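The abstract describes a shared feature generator feeding two classifiers whose output divergence drives noisy-sample detection, "unknown"-class discovery, and domain alignment. Below is a minimal PyTorch sketch of that two-head idea, not the authors' implementation: the class name TwoHeadNet, the 512-dimensional input features, the layer sizes, and the use of a per-sample L1 distance between softmax outputs as the divergence are all assumptions made for illustration.

import torch
import torch.nn as nn
import torch.nn.functional as F

class TwoHeadNet(nn.Module):
    # Shared feature generator with two classification heads, as described in
    # the abstract. Sizes are illustrative; the paper uses a CNN backbone.
    def __init__(self, in_dim=512, feat_dim=256, num_classes=10):
        super().__init__()
        self.generator = nn.Sequential(nn.Linear(in_dim, feat_dim), nn.ReLU())
        # Two heads initialized differently so their decision boundaries differ.
        self.classifier1 = nn.Linear(feat_dim, num_classes)
        self.classifier2 = nn.Linear(feat_dim, num_classes)

    def forward(self, x):
        feat = self.generator(x)
        return self.classifier1(feat), self.classifier2(feat)

def output_divergence(logits1, logits2):
    # Per-sample L1 distance between the two heads' class probabilities.
    # A large value can flag a noisy source label or an "unknown" target class;
    # the paper's exact divergence and training objectives are defined there.
    p1 = F.softmax(logits1, dim=1)
    p2 = F.softmax(logits2, dim=1)
    return (p1 - p2).abs().sum(dim=1)

# Usage sketch on a batch of pre-extracted 512-dimensional features.
model = TwoHeadNet()
x = torch.randn(8, 512)
logits1, logits2 = model(x)
scores = output_divergence(logits1, logits2)  # shape: (8,)

In training, such a divergence score would typically be minimized or maximized adversarially between the generator and the two classifiers so that domains are aligned while unreliable samples are down-weighted or rejected; the specific objectives and training schedule used in this paper should be taken from the paper itself.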
Pages: 2515-2524
Number of pages: 10
Related Papers
50 items in total (items 41-50 shown)
  • [41] Curriculum adaptation method based on graph neural networks for universal domain adaptation
    Fan, Cangning
    Liu, Peng
    Zhao, Wei
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 255
  • [42] Robust Semi-supervised Domain Adaptation against Noisy Labels
    Qin, Can
    Wang, Yizhou
    Fu, Yun
    PROCEEDINGS OF THE 31ST ACM INTERNATIONAL CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, CIKM 2022, 2022, : 4409 - 4413
  • [43] Noisy-Aware Unsupervised Domain Adaptation for Scene Text Recognition
    Liu, Xiao-Qian
    Zhang, Peng-Fei
    Luo, Xin
    Huang, Zi
    Xu, Xin-Shun
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2024, 33 : 6550 - 6563
  • [44] Towards Accurate and Robust Domain Adaptation Under Multiple Noisy Environments
    Han, Zhongyi
    Gui, Xian-Jin
    Sun, Haoliang
    Yin, Yilong
    Li, Shuo
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, 45 (05) : 6460 - 6479
  • [45] Domain consensual contrastive learning for few-shot universal domain adaptation
    Liao, Haojin
    Wang, Qiang
    Zhao, Sicheng
    Xing, Tengfei
    Hu, Runbo
    APPLIED INTELLIGENCE, 2023, 53 (22) : 27191 - 27206
  • [47] Regularized Hypothesis-Induced Wasserstein Divergence for unsupervised domain adaptation
    Si, Lingyu
    Dong, Hongwei
    Qiang, Wenwen
    Zheng, Changwen
    Yu, Junzhi
    Sun, Fuchun
    KNOWLEDGE-BASED SYSTEMS, 2024, 283
  • [48] Universal multi-Source domain adaptation for image classification
    Yin, Yueming
    Yang, Zhen
    Hu, Haifeng
    Wu, Xiaofu
    PATTERN RECOGNITION, 2022, 121
  • [49] Universal Domain Adaptation for Remote Sensing Image Scene Classification
    Xu, Qingsong
    Shi, Yilei
    Yuan, Xin
    Zhu, Xiao Xiang
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2023, 61
  • [50] Synthetic Source Universal Domain Adaptation through Contrastive Learning
    Cho, Jungchan
    SENSORS, 2021, 21 (22)