Structure-Preserved Multi-Source Domain Adaptation

Cited by: 0
Authors
Liu, Hongfu [1 ]
Shao, Ming [2 ]
Fu, Yun [1 ,3 ]
Affiliations
[1] Northeastern Univ, Dept Elect & Comp Sci, Boston, MA 02115 USA
[2] Univ Massachusetts Dartmouth, Coll Engn, Dartmouth, MA USA
[3] Northeastern Univ, Coll Comp & Informat Sci, Boston, MA 02115 USA
Funding
U.S. National Science Foundation;
Keywords
Transfer Learning; Multi-Source Domain Adaptation; Constraint Clustering;
DOI
10.1109/ICDM.2016.38
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Domain adaptation has achieved promising results in many areas, such as image classification and object recognition. Although many algorithms have been proposed to handle differing domain distributions, multi-source unsupervised domain adaptation remains a challenge. Moreover, most existing algorithms learn a classifier on the source domain and then predict labels for the target data, which means that only the knowledge derived from the hyperplane is transferred to the target domain while the structure information is ignored. In light of this, we propose a novel algorithm for multi-source unsupervised domain adaptation. Broadly, we aim to preserve the whole structure of the source domains and transfer it to serve the task on the target domain. The source and target data are clustered together, which simultaneously explores the structures of both domains; the structure-preserved information from the source domains then guides the clustering process on the target domain. Extensive experiments on two widely used databases for object recognition and face identification show substantial improvement of our approach over several state-of-the-art methods. In particular, our algorithm can make use of multiple source domains and achieves more robust performance than single-source domain adaptation methods.
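The core idea of the abstract — pool source and target samples, cluster them jointly so the shared structure is discovered once, then transfer source labels through the clusters — can be sketched as follows. This is only a minimal illustrative sketch of the general cluster-then-transfer pattern, not the authors' actual constrained-clustering formulation; the plain k-means routine and the majority-vote labeling are my own simplifying assumptions.

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain Lloyd's k-means (a stand-in for the paper's clustering step)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)].astype(float)
    for _ in range(iters):
        # Assign each point to its nearest center, then recompute centers.
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)
    return labels

def transfer_labels(Xs, ys, Xt, k):
    """Cluster pooled source+target data, then give each target point the
    majority source label of its cluster (structure-guided transfer)."""
    assign = kmeans(np.vstack([Xs, Xt]), k)
    src, tgt = assign[:len(Xs)], assign[len(Xs):]
    pred = np.empty(len(Xt), dtype=int)
    for j in range(k):
        src_in_j = ys[src == j]
        # Majority vote among source samples in the cluster; fall back to
        # the globally most frequent source label for source-empty clusters.
        vote = np.bincount(src_in_j).argmax() if len(src_in_j) else np.bincount(ys).argmax()
        pred[tgt == j] = vote
    return pred

# Toy usage: two well-separated classes; target points inherit the label of
# the source cluster they fall into.
Xs = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
ys = np.array([0, 0, 1, 1])
Xt = np.array([[0.05, 0.1], [5.05, 4.9]])
print(transfer_labels(Xs, ys, Xt, k=2))
```

Multiple source domains fit this sketch naturally: stacking all source sets into `Xs` lets every source contribute constraints to the shared cluster structure, which is the intuition behind using multi-source data jointly rather than training one classifier per source.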
Pages: 1059-1064 (6 pages)
Related Papers
50 records in total
  • [1] Structure-Preserved Unsupervised Domain Adaptation
    Liu, Hongfu
    Shao, Ming
    Ding, Zhengming
    Fu, Yun
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2019, 31 (04) : 799 - 812
  • [2] Multilevel structure-preserved GAN for domain adaptation in intravascular ultrasound analysis
    Xia, Menghua
    Yang, Hongbo
    Qu, Yanan
    Guo, Yi
    Zhou, Guohui
    Zhang, Feng
    Wang, Yuanyuan
    MEDICAL IMAGE ANALYSIS, 2022, 82
  • [3] A survey of multi-source domain adaptation
    Sun, Shiliang
    Shi, Honglei
    Wu, Yuanbin
    INFORMATION FUSION, 2015, 24 : 84 - 92
  • [4] Multi-Source Distilling Domain Adaptation
    Zhao, Sicheng
    Wang, Guangzhi
    Zhang, Shanghang
    Gu, Yang
    Li, Yaxian
    Song, Zhichao
    Xu, Pengfei
    Hu, Runbo
    Chai, Hua
    Keutzer, Kurt
    THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 12975 - 12983
  • [5] BAYESIAN MULTI-SOURCE DOMAIN ADAPTATION
    Sun, Shi-Liang
    Shi, Hong-Lei
    PROCEEDINGS OF 2013 INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND CYBERNETICS (ICMLC), VOLS 1-4, 2013, : 24 - 28
  • [6] Multi-Source Survival Domain Adaptation
    Shaker, Ammar
    Lawrence, Carolin
    THIRTY-SEVENTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 37 NO 8, 2023, : 9752 - 9762
  • [8] Wasserstein Barycenter for Multi-Source Domain Adaptation
    Montesuma, Eduardo Fernandes
    Mboula, Fred Maurice Ngole
    2021 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR 2021, 2021, : 16780 - 16788
  • [9] Multi-source Domain Adaptation for Semantic Segmentation
    Zhao, Sicheng
    Li, Bo
    Yue, Xiangyu
    Gu, Yang
    Xu, Pengfei
    Hu, Runbo
    Chai, Hua
    Keutzer, Kurt
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [10] Multi-Source Contribution Learning for Domain Adaptation
    Li, Keqiuyin
    Lu, Jie
    Zuo, Hua
    Zhang, Guangquan
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022, 33 (10) : 5293 - 5307