Multi-metric domain adaptation for unsupervised transfer learning

Cited by: 5
Authors
Yang, Hongwei [1 ]
He, Hui [1 ]
Li, Tao [1 ]
Bai, Yawen [1 ]
Zhang, Weizhe [1 ,2 ]
Affiliations
[1] Harbin Inst Technol, Sch Comp Sci & Technol, Harbin, Peoples R China
[2] Peng Cheng Lab, Shenzhen, Peoples R China
Funding
National Natural Science Foundation of China; National Key Research and Development Program of China;
Keywords
unsupervised learning; pattern classification; unlabelled target domain; labelled source domain; global transfer perspectives; local transfer perspectives; multimetric domain adaptation; MMDA; unsupervised transfer learning; marginal class distances; cross-domain adaptability; cross-domain manifold structures; global adaptation methods; local adaptation methods; unsupervised domain adaptation; FRAMEWORK; REGULARIZATION;
DOI
10.1049/iet-ipr.2019.1434
CLC Number
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Unsupervised domain adaptation aims to learn a classifier for an unlabelled target domain by leveraging knowledge from a labelled source domain. This study presents a novel domain adaptation framework built from global and local transfer perspectives, referred to as multi-metric domain adaptation (MMDA) for unsupervised transfer learning. At the global level, MMDA minimises the marginal and within-class distances and maximises the between-class distance between domains while preserving the features of the source domain, improving cross-domain adaptability. At the local level, MMDA exploits both in-domain and cross-domain manifold structures embedded in the data samples to increase discriminative ability. The authors learn a coupled transformation that projects the source- and target-domain data onto respective subspaces in which the statistical and geometrical divergences are reduced simultaneously. They formulate the global and local adaptation terms in a single optimisation problem and derive an analytic solution to the objective function. Extensive experiments demonstrate that MMDA improves classification accuracy compared with several existing state-of-the-art methods.
Pages: 2780-2790
Number of pages: 11
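
The record does not include code, so the following is a minimal illustrative sketch (not the authors' implementation) of the kind of objective the abstract describes: a global statistical term (here, marginal MMD only) combined with a local graph-Laplacian term, minimised under a variance-preserving constraint and solved analytically as a generalized eigenproblem. The function name mmda_sketch and the parameters dim, reg and n_neighbors are assumptions, and a single shared projection is used for simplicity in place of the paper's coupled source/target transformations.

import numpy as np
import scipy.linalg
from sklearn.neighbors import kneighbors_graph

def mmda_sketch(Xs, Xt, dim=30, reg=1.0, n_neighbors=5):
    """Illustrative subspace learning with MMD + graph-Laplacian terms.

    Xs: (ns, d) labelled source features; Xt: (nt, d) unlabelled target features.
    Returns projected source and target data of dimension at most `dim`.
    """
    X = np.vstack([Xs, Xt]).T                         # d x n, columns are samples
    X = X / np.linalg.norm(X, axis=0, keepdims=True)  # unit-norm columns
    ns, nt = Xs.shape[0], Xt.shape[0]
    n = ns + nt

    # Global term: marginal MMD matrix M, so that tr(A^T X M X^T A) equals the
    # squared distance between the projected source and target means.
    e = np.vstack([np.ones((ns, 1)) / ns, -np.ones((nt, 1)) / nt])
    M = e @ e.T

    # Local term: Laplacian L of a k-NN graph over all samples, penalising
    # projections that break the joint manifold structure.
    W = kneighbors_graph(X.T, n_neighbors=n_neighbors, mode='connectivity')
    W = 0.5 * (W + W.T).toarray()                     # symmetrise the graph
    L = np.diag(W.sum(axis=1)) - W

    # Variance preservation via the centring matrix H in the constraint.
    H = np.eye(n) - np.ones((n, n)) / n

    # Analytic solution: smallest generalized eigenvectors of
    #   X (M + reg*L) X^T a = lambda * X H X^T a.
    A_mat = X @ (M + reg * L) @ X.T
    B_mat = X @ H @ X.T + 1e-6 * np.eye(X.shape[0])   # small ridge for stability
    _, vecs = scipy.linalg.eigh(A_mat, B_mat)
    dim = min(dim, vecs.shape[1])
    A = vecs[:, :dim]                                 # d x dim projection

    Z = A.T @ X                                       # dim x n projected data
    return Z[:, :ns].T, Z[:, ns:].T

With the projected features, a simple classifier (e.g. 1-nearest-neighbour) trained on the projected source data is typically applied to the projected target data. The within-class and between-class distance terms mentioned in the abstract would additionally require pseudo-labels for the target domain, which this sketch deliberately omits.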