Gradual Domain Adaptation via Normalizing Flows

Citations: 0
Authors
Sagawa, Shogo [1 ,2 ]
Hino, Hideitsu [3 ,4 ]
Affiliations
[1] Grad Univ Adv Studies, Dept Stat Sci, Hayama, Kanagawa 2400193, Japan
[2] Konica Minolta, Hachioji, Tokyo 1928505, Japan
[3] Inst Stat Math, Dept Adv Data Sci, Tachikawa, Tokyo 1908562, Japan
[4] RIKEN, Ctr Adv Intelligence Project, Chuo Ku, Tokyo 1030027, Japan
Keywords
ENTROPY ESTIMATORS;
DOI
10.1162/neco_a_01734
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Standard domain adaptation methods do not work well when a large gap exists between the source and target domains. Gradual domain adaptation is one approach to this problem: it leverages intermediate domains that shift gradually from the source domain to the target domain. Previous work assumes that the number of intermediate domains is large and the distance between adjacent domains is small, so that gradual domain adaptation via self-training with unlabeled data sets is applicable. In practice, however, gradual self-training fails when the number of intermediate domains is limited and the distance between adjacent domains is large. We propose using normalizing flows to deal with this problem while maintaining the framework of unsupervised domain adaptation. The proposed method learns a transformation from the distribution of the target domain to a Gaussian mixture distribution via the source domain. We evaluate the proposed method in experiments on real-world data sets and confirm that it mitigates the above problem and improves classification performance.
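The core idea of a normalizing flow — fitting an invertible map that transports observed data onto a simple base distribution by maximum likelihood — can be illustrated with a deliberately minimal toy sketch. This is not the paper's model (which is a deep flow with a Gaussian-mixture base spanning multiple domains); here a one-dimensional affine flow simply recovers an unknown shift and scale:

```python
import numpy as np

# Toy illustration: fit an invertible affine map z = (x - b) / a so that
# "target domain" data x maps to a standard Gaussian base density, by
# gradient descent on the negative log-likelihood
#   NLL(a, b) = mean(0.5 * z**2) + log(a) + const.
rng = np.random.default_rng(0)
x = 3.0 + 2.0 * rng.standard_normal(1000)  # data with unknown shift/scale

a, b = 1.0, 0.0          # flow parameters (scale, shift)
lr = 0.02                # learning rate
for _ in range(5000):
    z = (x - b) / a
    grad_a = np.mean(1.0 - z**2) / a   # d NLL / d a
    grad_b = np.mean(-z) / a           # d NLL / d b
    a -= lr * grad_a
    b -= lr * grad_b

print(a, b)  # recovers the data's scale (~2) and shift (~3)
```

The `log(a)` term is the log-determinant of the map's Jacobian, the ingredient that distinguishes flow training from plain density fitting; the paper's method chains such invertible transformations across domains.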
Pages: 522-568
Page count: 47
Related Papers
50 records in total
  • [1] Normalizing flows for domain adaptation when identifying Λ hyperon events
    Kelleher, R.
    Vossen, A.
    JOURNAL OF INSTRUMENTATION, 2024, 19 (06):
  • [2] MapFlow: latent transition via normalizing flow for unsupervised domain adaptation
    Askari, Hossein
    Latif, Yasir
    Sun, Hongfu
    MACHINE LEARNING, 2023, 112 (08) : 2953 - 2974
  • [4] Gradual Domain Adaptation: Theory and Algorithms
    He, Yifei
    Wang, Haoxiang
    Li, Bo
    Zhao, Han
    JOURNAL OF MACHINE LEARNING RESEARCH, 2024, 25 : 1 - 40
  • [5] Curriculum Reinforcement Learning using Optimal Transport via Gradual Domain Adaptation
    Huang, Peide
    Xu, Mengdi
    Zhu, Jiacheng
    Shi, Laixi
    Fang, Fei
    Zhao, Ding
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [6] Active Gradual Domain Adaptation: Dataset and Approach
    Zhou, Shiji
    Wang, Lianzhe
    Zhang, Shanghang
    Wang, Zhi
    Zhu, Wenwu
    IEEE TRANSACTIONS ON MULTIMEDIA, 2022, 24 : 1210 - 1220
  • [7] Imbalanced Open Set Domain Adaptation via Moving-Threshold Estimation and Gradual Alignment
    Ru, Jinghan
    Tian, Jun
    Xiao, Chengwei
    Li, Jingjing
    Shen, Heng Tao
    IEEE TRANSACTIONS ON MULTIMEDIA, 2024, 26 : 2504 - 2514
  • [8] Flexible Approximate Inference via Stratified Normalizing Flows
    Cundy, Chris
    Ermon, Stefano
    CONFERENCE ON UNCERTAINTY IN ARTIFICIAL INTELLIGENCE (UAI 2020), 2020, 124 : 1288 - 1297
  • [9] Randomized Value Functions via Multiplicative Normalizing Flows
    Touati, Ahmed
    Satija, Harsh
    Romoff, Joshua
    Pineau, Joelle
    Vincent, Pascal
    35TH UNCERTAINTY IN ARTIFICIAL INTELLIGENCE CONFERENCE (UAI 2019), 2020, 115 : 422 - 432
  • [10] Understanding Self-Training for Gradual Domain Adaptation
    Kumar, Ananya
    Ma, Tengyu
    Liang, Percy
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 119, 2020, 119