Beyond Sharing Weights for Deep Domain Adaptation

Cited by: 272
Authors
Rozantsev, Artem [1 ]
Salzmann, Mathieu [1 ]
Fua, Pascal [1 ]
Affiliations
[1] Ecole Polytechnique Federale de Lausanne, Computer Vision Laboratory, CH-1015 Lausanne, Switzerland
Keywords
Domain adaptation; deep learning; recognition; features
DOI
10.1109/TPAMI.2018.2814042
CLC Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
The performance of a classifier trained on data coming from a specific domain typically degrades when applied to a related but different one. While annotating many samples from the new domain would address this issue, it is often too expensive or impractical. Domain Adaptation has therefore emerged as a solution to this problem; it leverages annotated data from a source domain, in which it is abundant, to train a classifier to operate in a target domain, in which it is either sparse or even lacking altogether. In this context, the recent trend consists of learning deep architectures whose weights are shared for both domains, which essentially amounts to learning domain invariant features. Here, we show that it is more effective to explicitly model the shift from one domain to the other. To this end, we introduce a two-stream architecture, where one operates in the source domain and the other in the target domain. In contrast to other approaches, the weights in corresponding layers are related but not shared. We demonstrate that this both yields higher accuracy than state-of-the-art methods on several object recognition and detection tasks and consistently outperforms networks with shared weights in both supervised and unsupervised settings.
Pages: 801-814
Page count: 14
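
The two-stream design described in the abstract, with corresponding layers holding related rather than shared weights, can be sketched as follows. This is a minimal illustrative example, not the authors' implementation: the layer sizes, the plain L2 penalty tying corresponding parameters together (the paper relates weights through a more flexible transformation), and the lambda_w coefficient are assumptions made for the sketch, and the domain-discrepancy term used in the unsupervised setting is omitted.

# Minimal two-stream sketch (illustrative only, not the paper's architecture or loss).
import copy
import torch
import torch.nn as nn


def make_stream() -> nn.Sequential:
    # Small fully connected classifier as a stand-in for the paper's CNN streams.
    return nn.Sequential(
        nn.Flatten(),
        nn.Linear(28 * 28, 256), nn.ReLU(),
        nn.Linear(256, 10),
    )


source_net = make_stream()
target_net = copy.deepcopy(source_net)   # same architecture, separate (unshared) weights

cls_loss = nn.CrossEntropyLoss()
lambda_w = 1e-3                          # assumed strength of the weight-relating term


def weight_regularizer(net_a: nn.Module, net_b: nn.Module) -> torch.Tensor:
    # Penalize divergence between corresponding parameters of the two streams.
    # The paper relates weights through a learned transform; a plain L2 distance
    # is used here only to keep the sketch short.
    reg = torch.zeros(())
    for p_a, p_b in zip(net_a.parameters(), net_b.parameters()):
        reg = reg + (p_a - p_b).pow(2).sum()
    return reg


# One illustrative training step with labeled source data (x_s, y_s) and, in the
# supervised setting, labeled target data (x_t, y_t); random tensors stand in for data.
x_s, y_s = torch.randn(32, 1, 28, 28), torch.randint(0, 10, (32,))
x_t, y_t = torch.randn(32, 1, 28, 28), torch.randint(0, 10, (32,))

opt = torch.optim.Adam(list(source_net.parameters()) + list(target_net.parameters()), lr=1e-3)

loss = (cls_loss(source_net(x_s), y_s)
        + cls_loss(target_net(x_t), y_t)
        + lambda_w * weight_regularizer(source_net, target_net))
opt.zero_grad()
loss.backward()
opt.step()

With lambda_w set very large the two streams are effectively forced to share weights, while lambda_w = 0 decouples them entirely; the regularized middle ground is what the abstract argues for.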