Rethinking Maximum Mean Discrepancy for Visual Domain Adaptation

Cited by: 44
Authors
Wang, Wei [1 ]
Li, Haojie [1 ]
Ding, Zhengming [2 ]
Nie, Feiping [3 ,4 ]
Chen, Junyang [5 ]
Dong, Xiao [6 ]
Wang, Zhihui [1 ]
Affiliations
[1] Dalian Univ Technol, Ritsumeikan Univ DUT RU, Int Sch Informat Sci & Engn, Dalian 116620, Liaoning, Peoples R China
[2] Tulane Univ, Dept Comp Sci, New Orleans, LA 70118 USA
[3] Northwestern Polytech Univ, Sch Comp Sci, Xian 710072, Shanxi, Peoples R China
[4] Northwestern Polytech Univ, Ctr Opt Imagery Anal & Learning OPTIMAL, Xian 710072, Shanxi, Peoples R China
[5] Shenzhen Univ, Coll Comp Sci & Software Engn, Shenzhen 518060, Guangdong, Peoples R China
[6] Sun Yat Sen Univ, Sch Artificial Intelligence, Guangzhou 510006, Guangdong, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Neural networks; Degradation; Kernel; Computer science; Task analysis; Semantics; Principal component analysis; Discriminability; domain adaptation (DA); inter-class distance; intra-class distance; maximum mean discrepancy (MMD);
DOI
10.1109/TNNLS.2021.3093468
CLC Number (Chinese Library Classification)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Existing domain adaptation approaches often try to reduce the distribution difference between source and target domains and to respect domain-specific discriminative structures via distribution distances [e.g., maximum mean discrepancy (MMD)] and discriminative distances (e.g., intra-class and inter-class distances). However, they usually combine these losses and trade off their relative importance with empirically estimated parameters. The relationships among these losses have so far been insufficiently explored, so they cannot be manipulated correctly and the model's performance degrades. To this end, this article theoretically proves two essential facts: 1) minimizing the MMD is equivalent to jointly minimizing the data variance with some implicit weights while, respectively, maximizing the source and target intra-class distances, so that feature discriminability degrades, and 2) the intra-class and inter-class distances are inversely related: as one falls, the other rises. Based on this, we propose a novel discriminative MMD with two parallel strategies to correctly restrain the degradation of feature discriminability, i.e., the expansion of the intra-class distance. Specifically: 1) we directly impose a tradeoff parameter on the intra-class distance that is implicit in the MMD, according to fact 1), and 2) we reformulate the inter-class distance with special weights analogous to the implicit ones in the MMD, so that maximizing it also makes the intra-class distance fall, according to fact 2). Notably, we do not combine the two strategies in one model, because of fact 2). Experiments on several benchmark datasets not only validate the revealed theoretical results but also demonstrate that the proposed approach substantially outperforms several state-of-the-art methods. Our preliminary MATLAB code will be available at https://github.com/WWLoveTransfer/.
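
To make the quantities traded off in the abstract concrete, the short Python sketch below computes an empirical linear-kernel MMD between source and target features together with standard intra-class and inter-class (scatter) distances. It follows only the textbook definitions; the paper's implicit weights and the proposed discriminative MMD are not reproduced here, and all function and variable names are illustrative assumptions rather than the authors' released MATLAB code.

# Minimal sketch (assumed names, textbook definitions; not the paper's method).
import numpy as np

def linear_mmd(Xs, Xt):
    """Squared MMD with a linear kernel: ||mean(Xs) - mean(Xt)||^2."""
    return float(np.sum((Xs.mean(axis=0) - Xt.mean(axis=0)) ** 2))

def intra_class_distance(X, y):
    """Sum of squared distances of samples to their own class mean."""
    total = 0.0
    for c in np.unique(y):
        Xc = X[y == c]
        total += np.sum((Xc - Xc.mean(axis=0)) ** 2)
    return float(total)

def inter_class_distance(X, y):
    """Size-weighted sum of squared distances between class means and the global mean."""
    mu = X.mean(axis=0)
    total = 0.0
    for c in np.unique(y):
        Xc = X[y == c]
        total += len(Xc) * np.sum((Xc.mean(axis=0) - mu) ** 2)
    return float(total)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    Xs, ys = rng.normal(0.0, 1.0, (100, 20)), rng.integers(0, 5, 100)
    Xt = rng.normal(0.5, 1.0, (80, 20))
    print("MMD^2           :", linear_mmd(Xs, Xt))
    print("intra-class dist:", intra_class_distance(Xs, ys))
    print("inter-class dist:", inter_class_distance(Xs, ys))

Under these standard definitions, the paper's first theoretical fact says that driving linear_mmd toward zero implicitly pushes intra_class_distance up (harming discriminability), which is what the proposed discriminative MMD is designed to restrain.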
Pages: 264-277
Number of pages: 14