MixStyle Neural Networks for Domain Generalization and Adaptation

Cited by: 7
Authors:
Zhou, Kaiyang [1]
Yang, Yongxin [2]
Qiao, Yu [3,4]
Xiang, Tao [5]
Affiliations:
[1] Hong Kong Baptist Univ, Hong Kong, Peoples R China
[2] Queen Mary Univ London, London, England
[3] Shanghai AI Lab, Shanghai, Peoples R China
[4] Chinese Acad Sci, Shenzhen Inst Adv Technol, Shenzhen, Peoples R China
[5] Univ Surrey, Guildford, England
DOI: 10.1007/s11263-023-01913-8
CLC Classification: TP18 [Artificial Intelligence Theory]
Subject Classification Codes: 081104; 0812; 0835; 1405
Abstract:
Neural networks do not generalize well to unseen data with domain shifts, a longstanding problem in machine learning and AI. To overcome this problem, we propose MixStyle, a simple, plug-and-play, parameter-free module that improves domain generalization performance without the need to collect more data or increase model capacity. The design of MixStyle is simple: it mixes the feature statistics of two random instances in a single forward pass during training. The idea is grounded in the finding from recent style-transfer research that feature statistics capture image style information, which essentially defines visual domains. Mixing feature statistics can therefore be seen as an efficient way to synthesize new domains in the feature space, thus achieving data augmentation. MixStyle is easy to implement with a few lines of code, requires no modification to training objectives, and fits a variety of learning paradigms, including supervised domain generalization, semi-supervised domain generalization, and unsupervised domain adaptation. Our experiments show that MixStyle can significantly boost out-of-distribution generalization performance across a wide range of tasks, including image recognition, instance retrieval, and reinforcement learning. The source code is released at https://github.com/KaiyangZhou/mixstyle-release.
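The abstract describes the mixing operation precisely enough to sketch. Below is a minimal PyTorch sketch of the style-mixing step, assuming the usual (B, C, H, W) feature-map layout; the hyper-parameter names and default values (p, alpha) here are illustrative, not taken from this record, and the linked repository should be treated as the reference implementation.

```python
import torch
import torch.nn as nn


class MixStyle(nn.Module):
    """Minimal sketch of the MixStyle module (not the official code).

    During training, mixes the channel-wise feature statistics (mean and
    standard deviation) of each instance with those of a randomly chosen
    instance in the same batch, synthesizing new "styles" in feature space.
    """

    def __init__(self, p=0.5, alpha=0.1, eps=1e-6):
        super().__init__()
        self.p = p          # probability of applying MixStyle to a batch
        self.beta = torch.distributions.Beta(alpha, alpha)
        self.eps = eps      # numerical stability when taking the std

    def forward(self, x):
        # Identity mapping at test time, and with probability 1 - p in training.
        if not self.training or torch.rand(1).item() > self.p:
            return x

        B = x.size(0)

        # Per-instance, channel-wise statistics: shape (B, C, 1, 1).
        mu = x.mean(dim=[2, 3], keepdim=True)
        var = x.var(dim=[2, 3], keepdim=True)
        sig = (var + self.eps).sqrt()
        mu, sig = mu.detach(), sig.detach()  # stop gradients through the stats

        # Normalize each instance with its own statistics.
        x_normed = (x - mu) / sig

        # Mixing weights lambda ~ Beta(alpha, alpha), one per instance.
        lam = self.beta.sample((B, 1, 1, 1)).to(x.device)

        # Mix statistics with those of a randomly shuffled batch.
        perm = torch.randperm(B, device=x.device)
        mu_mix = lam * mu + (1 - lam) * mu[perm]
        sig_mix = lam * sig + (1 - lam) * sig[perm]

        # Re-style the normalized features with the mixed statistics.
        return x_normed * sig_mix + mu_mix
```

In the paper's experiments the module is inserted after the early blocks of a CNN backbone during training only; see the repository above for the exact placements used.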
Pages: 822-836
Page count: 15
Related papers
50 items in total (items [41]-[50] shown below)
  • [41] Domain Adaptation and Generalization: A Low-Complexity Approach
    Niemeijer, Joshua
    Schaefer, Joerg P.
    CONFERENCE ON ROBOT LEARNING, VOL 205, 2022: 1081-1091
  • [42] Certifying Better Robust Generalization for Unsupervised Domain Adaptation
    Gao, Zhiqiang
    Zhang, Shufei
    Huang, Kaizhu
    Wang, Qiufeng
    Zhang, Rui
    Zhong, Chaoliang
    PROCEEDINGS OF THE 30TH ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, MM 2022, 2022: 2399-2410
  • [43] Collaborative Optimization and Aggregation for Decentralized Domain Generalization and Adaptation
    Wu, Guile
    Gong, Shaogang
    2021 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2021), 2021: 6464-6473
  • [44] Wavelet Neural Networks Generalization Improvement
    Skhiri, Mohamed Zine El Abidine
    Chtourou, Mohamed
    2013 10TH INTERNATIONAL MULTI-CONFERENCE ON SYSTEMS, SIGNALS & DEVICES (SSD), 2013
  • [45] Improved Test-Time Adaptation for Domain Generalization
    Chen, Liang
    Zhang, Yong
    Song, Yibing
    Shan, Ying
    Liu, Lingqiao
    2023 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2023: 24172-24182
  • [46] Semi-supervised Deep Domain Adaptation via Coupled Neural Networks
    Ding, Zhengming
    Nasrabadi, Nasser M.
    Fu, Yun
    IEEE TRANSACTIONS ON IMAGE PROCESSING, 2018, 27(11): 5214-5224
  • [47] Contrastive Class-aware Adaptation for Domain Generalization
    Chen, Tianle
    Baktashmotlagh, Mahsa
    Salzmann, Mathieu
    2022 26TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2022: 4871-4876
  • [48] Adversarial and Random Transformations for Robust Domain Adaptation and Generalization
    Xiao, Liang
    Xu, Jiaolong
    Zhao, Dawei
    Shang, Erke
    Zhu, Qi
    Dai, Bin
    SENSORS, 2023, 23(11)
  • [49] Correlation-aware adversarial domain adaptation and generalization
    Rahman, Mohammad Mahfujur
    Fookes, Clinton
    Baktashmotlagh, Mahsa
    Sridharan, Sridha
    PATTERN RECOGNITION, 2020, 100
  • [50] Bagging Adversarial Neural Networks for Domain Adaptation in Non-Stationary EEG
    Raza, Haider
    Samothrakis, Spyridon
    2019 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2019