Gradual Domain Adaptation via Normalizing Flows

Cited: 0
Authors
Sagawa, Shogo [1 ,2 ]
Hino, Hideitsu [3 ,4 ]
Affiliations
[1] Grad Univ Adv Studies, Dept Stat Sci, Hayama, Kanagawa 2400193, Japan
[2] Konica Minolta, Hachioji, Tokyo 1928505, Japan
[3] Inst Stat Math, Dept Adv Data Sci, Tachikawa, Tokyo 1908562, Japan
[4] RIKEN, Ctr Adv Intelligence Project, Chuo Ku, Tokyo 1030027, Japan
Keywords
ENTROPY ESTIMATORS
DOI
10.1162/neco_a_01734
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Standard domain adaptation methods do not work well when there is a large gap between the source and target domains. Gradual domain adaptation is one approach to this problem: it leverages intermediate domains that shift gradually from the source domain to the target domain. Previous work assumes that the number of intermediate domains is large and the distance between adjacent domains is small, so that a gradual domain adaptation algorithm based on self-training with unlabeled data sets is applicable. In practice, however, gradual self-training fails when the number of intermediate domains is limited and the distance between adjacent domains is large. We propose using normalizing flows to deal with this problem while staying within the framework of unsupervised domain adaptation. The proposed method learns a transformation from the distribution of the target domains to a Gaussian mixture distribution via the source domain. We evaluate the proposed method in experiments on real-world data sets and confirm that it mitigates the problem described above and improves classification performance.
Pages: 522–568
Page count: 47
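The abstract describes the core idea: fit a normalizing flow that transports the observed domain distributions onto a Gaussian mixture base density. The following is a minimal sketch, not the authors' implementation: a RealNVP-style coupling flow trained by maximum likelihood against a two-component Gaussian mixture. PyTorch, the layer sizes, the four-dimensional toy data, and the fixed component means at ±2 are all assumptions made for this sketch, and the paper's gradual, domain-by-domain training schedule is omitted.

# A minimal sketch (assumptions noted above, not the authors' code):
# a coupling-layer normalizing flow whose base density is a
# two-component Gaussian mixture, trained by maximum likelihood.
import torch
import torch.nn as nn
import torch.distributions as D

class AffineCoupling(nn.Module):
    """RealNVP-style layer: rescales and shifts half of the features
    conditioned on the other half, so the Jacobian is triangular."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.half = dim // 2
        self.net = nn.Sequential(
            nn.Linear(self.half, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * (dim - self.half)),
        )

    def forward(self, x):
        x1, x2 = x[:, :self.half], x[:, self.half:]
        s, t = self.net(x1).chunk(2, dim=1)
        s = torch.tanh(s)                     # bound the log-scale for stability
        z2 = x2 * torch.exp(s) + t
        return torch.cat([x1, z2], dim=1), s.sum(dim=1)  # output, log|det J|

class Flow(nn.Module):
    def __init__(self, dim, n_layers=4):
        super().__init__()
        self.layers = nn.ModuleList(AffineCoupling(dim) for _ in range(n_layers))
        self.perms = [torch.randperm(dim) for _ in range(n_layers)]
        # Base density: two Gaussian components (e.g., one mode per class).
        means = torch.stack([-2.0 * torch.ones(dim), 2.0 * torch.ones(dim)])
        comps = D.Independent(D.Normal(means, torch.ones(2, dim)), 1)
        self.base = D.MixtureSameFamily(D.Categorical(torch.ones(2)), comps)

    def log_prob(self, x):
        log_det = torch.zeros(x.shape[0])
        for layer, perm in zip(self.layers, self.perms):
            x = x[:, perm]                    # shuffle so every dim gets updated
            x, ld = layer(x)
            log_det = log_det + ld
        return self.base.log_prob(x) + log_det

# Maximum-likelihood fit on unlabeled samples from one domain.
flow = Flow(dim=4)
opt = torch.optim.Adam(flow.parameters(), lr=1e-3)
x = torch.randn(256, 4) + 1.0                 # stand-in for real domain data
for _ in range(200):
    loss = -flow.log_prob(x).mean()           # negative log-likelihood
    opt.zero_grad()
    loss.backward()
    opt.step()

Because each coupling layer changes only half of the coordinates, the Jacobian log-determinant is just the sum of the predicted log-scales, so the exact likelihood is cheap to compute. In the setting described in the abstract, such a flow would transport the target-domain distribution to the Gaussian mixture via the source domain.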
Related Papers (50 in total)
  • [21] Graph Normalizing Flows
    Liu, Jenny
    Kumar, Aviral
    Ba, Jimmy
    Kiros, Jamie
    Swersky, Kevin
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [22] Stochastic Normalizing Flows
    Wu, Hao
    Koehler, Jonas
    Noe, Frank
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [23] Leveraging exploration in off-policy algorithms via normalizing flows
    Mazoure, Bogdan
    Doan, Thang
    Durand, Audrey
    Pineau, Joelle
    Hjelm, R. Devon
    CONFERENCE ON ROBOT LEARNING, VOL 100, 2019, 100
  • [24] ClimAlign: Unsupervised statistical downscaling of climate variables via normalizing flows
    Groenke, Brian
    Madaus, Luke
    Monteleoni, Claire
    PROCEEDINGS OF 2020 10TH INTERNATIONAL CONFERENCE ON CLIMATE INFORMATICS (CI2020), 2020, : 60 - 66
  • [25] NF-iSAM: Incremental Smoothing and Mapping via Normalizing Flows
    Huang, Qiangqiang
    Pu, Can
    Fourie, Dehann
    Khosoussi, Kasra
    How, Jonathan P.
    Leonard, John J.
    2021 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA 2021), 2021, : 1095 - 1102
  • [26] Understanding Gradual Domain Adaptation: Improved Analysis, Optimal Path and Beyond
    Wang, Haoxiang
    Li, Bo
    Zhao, Han
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022
  • [27] Riemannian Continuous Normalizing Flows
    Mathieu, Emile
    Nickel, Maximilian
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [28] Event generation with normalizing flows
    Gao, Christina
    Hoche, Stefan
    Isaacson, Joshua
    Krause, Claudius
    Schulz, Holger
    PHYSICAL REVIEW D, 2020, 101 (07)
  • [29] Normalizing Flows for Intelligent Manufacturing
    Russell, Matthew
    Wang, Peng
    PROCEEDINGS OF ASME 2023 18TH INTERNATIONAL MANUFACTURING SCIENCE AND ENGINEERING CONFERENCE, MSEC2023, VOL 2, 2023
  • [30] Densely connected normalizing flows
    Grcic, Matej
    Grubisic, Ivan
    Segvic, Sinisa
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34