Generatively Inferential Co-Training for Unsupervised Domain Adaptation

Cited: 15
Authors
Qin, Can [1 ]
Wang, Lichen [1 ]
Zhang, Yulun [1 ]
Fu, Yun [1 ]
Institution
[1] Northeastern Univ, Boston, MA 02115 USA
Funding
US National Science Foundation;
DOI
10.1109/ICCVW.2019.00135
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
Deep Neural Networks (DNNs) have greatly boosted performance on a wide range of computer vision and machine learning tasks. Despite such achievements, DNNs are hungry for enormous amounts of high-quality (HQ) training data, which are expensive and time-consuming to collect. To tackle this challenge, domain adaptation (DA) can help learn a model by leveraging the knowledge of low-quality (LQ) data (i.e., the source domain) while generalizing well on label-scarce HQ data (i.e., the target domain). However, existing methods have two problems. First, they mainly focus on high-level feature alignment while neglecting low-level mismatch. Second, a class-conditional distribution shift persists even when features are well aligned. To solve these problems, we propose a novel Generatively Inferential Co-Training (GICT) framework for Unsupervised Domain Adaptation (UDA). GICT is based on cross-domain feature generation and a specifically designed co-training strategy. Feature generation adapts the representation at a low level by translating images across domains. Co-training is employed to bridge the conditional distribution shift by assigning high-confidence pseudo labels on the target domain, inferred from two distinct classifiers. Extensive experiments on multiple tasks, including image classification and semantic segmentation, demonstrate the effectiveness of the GICT approach.
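The co-training step described in the abstract assigns pseudo labels to target-domain samples on which two distinct classifiers agree with high confidence. A minimal sketch of one such selection rule follows; this is a generic illustration, not GICT's exact implementation, and the function name, the `threshold` value, and the toy probabilities are all illustrative assumptions:

```python
import numpy as np

def cotrain_pseudo_labels(probs_a, probs_b, threshold=0.9):
    """Select target samples on which two classifiers agree with high
    confidence, and return the agreed predictions as pseudo labels.

    probs_a, probs_b: (N, C) arrays of class probabilities from the two
    classifiers over N unlabeled target samples and C classes.
    Returns (indices, labels) of the selected samples.
    """
    pred_a = probs_a.argmax(axis=1)
    pred_b = probs_b.argmax(axis=1)
    conf_a = probs_a.max(axis=1)
    conf_b = probs_b.max(axis=1)
    # Keep only samples where both classifiers predict the same class
    # and each does so with confidence at or above the threshold.
    mask = (pred_a == pred_b) & (conf_a >= threshold) & (conf_b >= threshold)
    idx = np.nonzero(mask)[0]
    return idx, pred_a[idx]

# Toy example: 4 target samples, 3 classes.
pa = np.array([[0.95, 0.03, 0.02],   # confident class 0
               [0.40, 0.35, 0.25],   # low confidence
               [0.05, 0.92, 0.03],   # confident class 1
               [0.91, 0.05, 0.04]])  # confident class 0
pb = np.array([[0.93, 0.05, 0.02],
               [0.50, 0.30, 0.20],
               [0.06, 0.91, 0.03],
               [0.10, 0.85, 0.05]])  # disagrees with pa on sample 3
idx, labels = cotrain_pseudo_labels(pa, pb)
print(idx, labels)  # samples 0 and 2 selected, with labels 0 and 1
```

The selected samples would then be added to the training pool for the next round, which is the usual co-training loop; the two classifiers in GICT are trained to be distinct so that their agreement is a meaningful confidence signal.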
Pages: 1055-1064
Page count: 10
Related Papers
50 in total
  • [1] Co-Training for Unsupervised Domain Adaptation of Semantic Segmentation Models
    Gomez, Jose L.
    Villalonga, Gabriel
    Lopez, Antonio M.
    SENSORS, 2023, 23 (02)
  • [2] Unsupervised domain adaptation for object detection through mixed-domain and co-training learning
    Wei, Xing
    Qin, Xiongbo
    Zhao, Chong
    Qiao, Xuanyuan
    Lu, Yang
    MULTIMEDIA TOOLS AND APPLICATIONS, 2024, 83 (09) : 25213 - 25229
  • [4] UNSUPERVISED CHANNEL ADAPTATION FOR LANGUAGE IDENTIFICATION USING CO-TRAINING
    Ganapathy, Sriram
    Omar, Mohamed
    Pelecanos, Jason
    2013 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2013, : 6857 - 6861
  • [5] A Co-Training Framework for Heterogeneous Heuristic Domain Adaptation
    Yang, Cuie
    Xue, Bing
    Tan, Kay Chen
    Zhang, Mengjie
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (05) : 6863 - 6877
  • [6] Symmetric co-training based unsupervised domain adaptation approach for intelligent fault diagnosis of rolling bearing
    Yu, Kun
    Han, Hongzheng
    Fu, Qiang
    Ma, Hui
    Zeng, Jin
    MEASUREMENT SCIENCE AND TECHNOLOGY, 2020, 31 (11)
  • [7] Unsupervised Conversation Disentanglement through Co-Training
    Liu, Hui
    Shi, Zhan
    Zhu, Xiaodan
    2021 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2021), 2021, : 2345 - 2356
  • [8] Deep Co-Training with Task Decomposition for Semi-Supervised Domain Adaptation
    Yang, Luyu
    Wang, Yan
    Gao, Mingfei
    Shrivastava, Abhinav
    Weinberger, Kilian Q.
    Chao, Wei-Lun
    Lim, Ser-Nam
    2021 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2021), 2021, : 8886 - 8896
  • [9] Co-training Disentangled Domain Adaptation Network for Leveraging Popularity Bias in Recommenders
    Chen, Zhihong
    Wu, Jiawei
    Li, Chenliang
    Chen, Jingxu
    Xiao, Rong
    Zhao, Binqiang
    PROCEEDINGS OF THE 45TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL (SIGIR '22), 2022, : 60 - 69
  • [10] Co-training an Unsupervised Constituency Parser with Weak Supervision
    Maveli, Nickil
    Cohen, Shay B.
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2022), 2022, : 1274 - 1291