Gradual Domain Adaptation via Normalizing Flows

Cited by: 0
Authors
Sagawa, Shogo [1 ,2 ]
Hino, Hideitsu [3 ,4 ]
Affiliations
[1] Grad Univ Adv Studies, Dept Stat Sci, Hayama, Kanagawa 2400193, Japan
[2] Konica Minolta, Hachioji, Tokyo 1928505, Japan
[3] Inst Stat Math, Dept Adv Data Sci, Tachikawa, Tokyo 1908562, Japan
[4] RIKEN, Ctr Adv Intelligence Project, Chuo Ku, Tokyo 1030027, Japan
Keywords
ENTROPY ESTIMATORS;
DOI
10.1162/neco_a_01734
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Standard domain adaptation methods do not work well when there is a large gap between the source and target domains. Gradual domain adaptation addresses this problem by leveraging intermediate domains that shift gradually from the source domain to the target domain. Previous work assumes that the number of intermediate domains is large and the distance between adjacent domains is small, so that gradual self-training with unlabeled data sets is applicable. In practice, however, gradual self-training fails when the number of intermediate domains is limited and the distance between adjacent domains is large. We propose using normalizing flows to deal with this problem while remaining within the framework of unsupervised domain adaptation. The proposed method learns a transformation from the distribution of the target domain to a Gaussian mixture distribution via the source domain. We evaluate the proposed method in experiments on real-world data sets and confirm that it mitigates the problem described above and improves classification performance.
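The abstract above rests on the key property of normalizing flows: an exactly invertible transformation with a tractable Jacobian determinant, which lets one map data between a domain's distribution and a simple base distribution (here, a Gaussian mixture). The sketch below is not the authors' implementation; it is a minimal, self-contained illustration of one standard flow building block (a RealNVP-style affine coupling layer), with toy linear maps standing in for the usual neural networks.

```python
import numpy as np

class AffineCoupling:
    """Illustrative RealNVP-style affine coupling layer.

    Splits the input in half; the first half passes through unchanged
    and parameterizes a scale/shift applied to the second half, which
    makes the map exactly invertible with a cheap log-det-Jacobian.
    """

    def __init__(self, dim, seed=0):
        rng = np.random.default_rng(seed)
        half = dim // 2
        # Toy linear "networks" for log-scale and translation
        # (a real flow would use small MLPs here).
        self.Ws = 0.1 * rng.standard_normal((half, half))
        self.Wt = 0.1 * rng.standard_normal((half, half))

    def forward(self, x):
        x1, x2 = np.split(x, 2, axis=-1)
        s = np.tanh(x1 @ self.Ws)              # bounded log-scale
        t = x1 @ self.Wt                       # translation
        y2 = x2 * np.exp(s) + t
        log_det = s.sum(axis=-1)               # log |det Jacobian|
        return np.concatenate([x1, y2], axis=-1), log_det

    def inverse(self, y):
        y1, y2 = np.split(y, 2, axis=-1)
        s = np.tanh(y1 @ self.Ws)
        t = y1 @ self.Wt
        x2 = (y2 - t) * np.exp(-s)
        return np.concatenate([y1, x2], axis=-1)

layer = AffineCoupling(dim=4)
x = np.random.default_rng(1).standard_normal((5, 4))
y, log_det = layer.forward(x)
x_rec = layer.inverse(y)
assert np.allclose(x, x_rec)  # exact invertibility
```

Exact invertibility plus the log-det-Jacobian is what allows a flow to evaluate densities under the change-of-variables formula; stacking such layers (with the halves alternating) yields the expressive bijections used to connect the target-domain distribution to the Gaussian mixture via the source domain.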
Pages: 522-568
Page count: 47
Related Papers
50 items in total; items [31] to [40] shown
  • [31] Gradient Boosted Normalizing Flows
    Giaquinto, Robert
    Banerjee, Arindam
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [32] Principled Interpolation in Normalizing Flows
    Fadel, Samuel G.
    Mair, Sebastian
    Torres, Ricardo da S.
    Brefeld, Ulf
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2021: RESEARCH TRACK, PT II, 2021, 12976 : 116 - 131
  • [33] Normalizing Flows for LHC Theory
    Butter, Anja
    20TH INTERNATIONAL WORKSHOP ON ADVANCED COMPUTING AND ANALYSIS TECHNIQUES IN PHYSICS RESEARCH, 2023, 2438
  • [34] Sliced Iterative Normalizing Flows
    Dai, Biwei
    Seljak, Uros
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139
  • [35] GRADIENT ESTIMATORS FOR NORMALIZING FLOWS
    Bialas, Piotr
    Korcyl, Piotr
    Stebel, Tomasz
    ACTA PHYSICA POLONICA B, 2024, 55 (03):
  • [36] Variational Inference with Normalizing Flows
    Rezende, Danilo Jimenez
    Mohamed, Shakir
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 37, 2015, 37 : 1530 - 1538
  • [37] Normalizing flows for atomic solids
    Wirnsberger, Peter
    Papamakarios, George
    Ibarz, Borja
    Racaniere, Sebastien
    Ballard, Andrew J.
    Pritzel, Alexander
    Blundell, Charles
    MACHINE LEARNING-SCIENCE AND TECHNOLOGY, 2022, 3 (02):
  • [38] AlignFlow: Cycle Consistent Learning from Multiple Domains via Normalizing Flows
    Grover, Aditya
    Chute, Christopher
    Shu, Rui
    Cao, Zhangjie
    Ermon, Stefano
    THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 4028 - 4035
  • [39] Detecting Anomalies in Small Unmanned Aerial Systems via Graphical Normalizing Flows
    Ma, Yihong
    Islam, Md Nafee Al
    Cleland-Huang, Jane
    Chawla, Nitesh V.
    IEEE INTELLIGENT SYSTEMS, 2023, 38 (02) : 46 - 54
  • [40] Step-by-step gradual domain adaptation for rotating machinery fault diagnosis
    Sun, Haoran
    Zeng, Jia
    Wang, Yi
    Ruan, Hulin
    Meng, Lihua
    MEASUREMENT SCIENCE AND TECHNOLOGY, 2022, 33 (07)