Gradual Domain Adaptation via Normalizing Flows

Cited by: 0
Authors
Sagawa, Shogo [1 ,2 ]
Hino, Hideitsu [3 ,4 ]
Affiliations
[1] Grad Univ Adv Studies, Dept Stat Sci, Hayama, Kanagawa 2400193, Japan
[2] Konica Minolta, Hachioji, Tokyo 1928505, Japan
[3] Inst Stat Math, Dept Adv Data Sci, Tachikawa, Tokyo 1908562, Japan
[4] RIKEN, Ctr Adv Intelligence Project, Chuo Ku, Tokyo 1030027, Japan
Keywords
ENTROPY ESTIMATORS
DOI
10.1162/neco_a_01734
CLC Classification Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Standard domain adaptation methods do not work well when a large gap exists between the source and target domains. Gradual domain adaptation is one approach to this problem: it leverages intermediate domains that shift gradually from the source domain to the target domain. Previous work assumes that the number of intermediate domains is large and the distance between adjacent domains is small, so that gradual domain adaptation via self-training on unlabeled data sets is applicable. In practice, however, gradual self-training fails when the number of intermediate domains is limited and the distance between adjacent domains is large. We propose the use of normalizing flows to deal with this problem while maintaining the framework of unsupervised domain adaptation. The proposed method learns a transformation from the distributions of the target domains to a Gaussian mixture distribution via the source domain. We evaluate the proposed method in experiments on real-world data sets and confirm that it mitigates the problem described above and improves classification performance.
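To make the idea concrete, below is a minimal, hypothetical PyTorch sketch of the flow-based recipe the abstract describes: an affine-coupling (RealNVP-style) flow is trained to map source-domain inputs to a Gaussian mixture in latent space, and is then re-fit on each intermediate domain in turn by unlabeled maximum likelihood. This is not the authors' implementation; the model, toy data, and hyperparameters (AffineCoupling, rotate, the mixture means, and so on) are illustrative, and the paper additionally uses source labels to tie each class to a mixture component, which this sketch omits.

```python
# Hypothetical sketch (not the authors' code): gradual adaptation with a
# RealNVP-style affine-coupling flow whose latent distribution is a
# two-component Gaussian mixture.
import math
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    """Affine coupling layer: one half of the dimensions is transformed
    conditioned on the other half; `flip` alternates which half is which."""
    def __init__(self, dim=2, hidden=64, flip=False):
        super().__init__()
        self.flip = flip
        half = dim // 2
        self.net = nn.Sequential(
            nn.Linear(half, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - half)),
        )

    def forward(self, x):
        x1, x2 = x.chunk(2, dim=-1)
        if self.flip:
            x1, x2 = x2, x1
        s, t = self.net(x1).chunk(2, dim=-1)
        s = torch.tanh(s)                         # bounded log-scale for stability
        z2 = x2 * torch.exp(s) + t
        out = torch.cat([z2, x1] if self.flip else [x1, z2], dim=-1)
        return out, s.sum(dim=-1)                 # output and log|det Jacobian|

class Flow(nn.Module):
    def __init__(self, dim=2, n_layers=6):
        super().__init__()
        self.layers = nn.ModuleList(
            [AffineCoupling(dim, flip=bool(i % 2)) for i in range(n_layers)]
        )

    def forward(self, x):
        log_det = torch.zeros(x.shape[0])
        for layer in self.layers:
            x, ld = layer(x)
            log_det = log_det + ld
        return x, log_det

def gmm_log_prob(z, means, std=0.5):
    """Log density of an equal-weight Gaussian mixture (one component per class)."""
    comp = torch.distributions.Normal(means.unsqueeze(0), std)   # (1, K, D)
    logp = comp.log_prob(z.unsqueeze(1)).sum(-1)                 # (N, K)
    return torch.logsumexp(logp, dim=1) - math.log(means.shape[0])

def nll(flow, x, means):
    z, log_det = flow(x)                          # change of variables:
    return -(gmm_log_prob(z, means) + log_det).mean()  # log p(x) = log p(z) + log|det J|

def rotate(x, deg):
    """Toy domain shift: rotate the data distribution by `deg` degrees."""
    th = math.radians(deg)
    R = torch.tensor([[math.cos(th), -math.sin(th)], [math.sin(th), math.cos(th)]])
    return x @ R.T

torch.manual_seed(0)
source = torch.cat([torch.randn(200, 2) + 3.0, torch.randn(200, 2) - 3.0])
domains = [rotate(source, 30.0), rotate(source, 60.0)]   # intermediate, then target
means = torch.tensor([[3.0, 3.0], [-3.0, -3.0]])         # latent mixture components

flow = Flow()
opt = torch.optim.Adam(flow.parameters(), lr=1e-3)
for x in [source] + domains:      # fit the source first, then each shifted domain
    for _ in range(300):
        opt.zero_grad()
        nll(flow, x, means).backward()
        opt.step()
    print(f"domain NLL after adaptation: {nll(flow, x, means).item():.3f}")
```

Bounding the log-scale with tanh keeps the Jacobian well conditioned; in the paper's setting the mixture components would be tied to class labels on the source domain, so classification in latent space comes for free once the flow is adapted.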
Pages: 522-568
Number of Pages: 47
Related Papers
50 records in total (entries [41]-[50] shown)
  • [41] Gradual Domain Adaptation for Segmenting Whole Slide Images Showing Pathological Variability
    Gadermayr, Michael
    Eschweiler, Dennis
    Klinkhammer, Barbara Mara
    Boor, Peter
    Merhof, Dorit
    IMAGE AND SIGNAL PROCESSING (ICISP 2018), 2018, 10884 : 461 - 469
  • [42] GENERALIZATION BOUNDS FOR DOMAIN ADAPTATION VIA DOMAIN TRANSFORMATIONS
    Vural, Elif
2018 IEEE 28TH INTERNATIONAL WORKSHOP ON MACHINE LEARNING FOR SIGNAL PROCESSING (MLSP), 2018
  • [43] A Noising-Denoising Framework for Point Cloud Upsampling via Normalizing Flows
    Hu, Xin
    Wei, Xin
    Sun, Jian
    PATTERN RECOGNITION, 2023, 140
  • [44] Unsupervised video anomaly detection via normalizing flows with implicit latent features
    Cho, MyeongAh
    Kim, Taeoh
    Kim, Woo Jin
    Cho, Suhwan
    Lee, Sangyoun
    PATTERN RECOGNITION, 2022, 129
  • [45] Domain Adaptation via Prompt Learning
    Ge, Chunjiang
    Huang, Rui
    Xie, Mixue
    Lai, Zihang
    Song, Shiji
    Li, Shuang
    Huang, Gao
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2025, 36 (01) : 1160 - 1170
  • [46] Unsupervised Domain Adaptation via Deep Conditional Adaptation Network
    Ge, Pengfei
    Ren, Chuan-Xian
    Xu, Xiao-Lin
    Yan, Hong
    PATTERN RECOGNITION, 2023, 134
  • [47] Multiscale Super-Resolution Remote Imaging via Deep Conditional Normalizing Flows
    Heintz, Aneesh M.
    Peck, Mason
    Mackey, Ian
    JOURNAL OF AEROSPACE INFORMATION SYSTEMS, 2023, 20 (08): : 473 - 488
  • [48] Learning and Generalization in Overparameterized Normalizing Flows
    Shah, Kulin
    Deshpande, Amit
    Goyal, Navin
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 151, 2022, 151
  • [49] Canonical normalizing flows for manifold learning
    Flouris, Kyriakos
    Konukoglu, Ender
ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023
  • [50] Latent Normalizing Flows for Discrete Sequences
    Ziegler, Zachary M.
    Rush, Alexander M.
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019, 97