A Method for Recovering on Unsupervised Domain Adaptation Models Compression

Cited by: 0
|
Authors
Wang, Shou-Ping [1 ]
Chen, Erh-Chung [1 ]
Yang, Meng-Hsuan [1 ]
Lee, Che-Rung [1 ]
Affiliations
[1] Natl Tsing Hua Univ, Dept Comp Sci, Hsinchu, Taiwan
Keywords
Model compression; Domain adaptation; Unsupervised learning; Image classification; NEURAL-NETWORKS;
D O I
10.1109/CSCI62032.2023.00016
Chinese Library Classification
TP18 [Artificial intelligence theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Unsupervised domain adaptation (UDA) aims to transfer knowledge learned from a source domain with labeled data to a target domain with unlabeled data. For practical use, UDA models are usually compressed to fit into smaller devices. However, previous work on UDA compression cannot obtain satisfactory accuracy after compression and fine-tuning. Our study shows that one of the major reasons is the growth of domain discrepancy during the fine-tuning process. To recover the model accuracy on the target domain data, we propose a two-step fine-tuning strategy: the first step is like an ordinary tuning process whose goal is to reduce the classification error; the second step is designed to suppress the growth of domain discrepancy. We propose a sampling technique to estimate the domain discrepancy of the pruned model and use it in the loss function of the second step. We validated the performance of our method on the ImageCLEF-DA, Office-31, and Office-Home datasets. The results show that our method reaches higher average accuracy than previous work, especially on unbalanced datasets such as Office-31.
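The second fine-tuning step described above adds a sampled discrepancy estimate to the loss. The abstract does not specify the discrepancy measure, so the sketch below is a minimal NumPy illustration assuming a linear-kernel maximum mean discrepancy (MMD) as the discrepancy term; the function names (`mmd_linear`, `sampled_discrepancy`, `step2_loss`) and the weighting parameter `lam` are hypothetical, not taken from the paper.

```python
import numpy as np

def mmd_linear(src, tgt):
    """Squared linear-kernel MMD between two feature batches:
    the squared distance between their feature means."""
    delta = src.mean(axis=0) - tgt.mean(axis=0)
    return float(delta @ delta)

def sampled_discrepancy(src_feats, tgt_feats, n_samples=64, seed=0):
    """Estimate domain discrepancy on random subsamples of the
    pruned model's source/target features (the sampling step)."""
    rng = np.random.default_rng(seed)
    s = src_feats[rng.choice(len(src_feats), n_samples, replace=True)]
    t = tgt_feats[rng.choice(len(tgt_feats), n_samples, replace=True)]
    return mmd_linear(s, t)

def step2_loss(cls_loss, src_feats, tgt_feats, lam=0.1):
    """Second-step objective: classification loss plus a penalty
    that suppresses the growth of domain discrepancy."""
    return cls_loss + lam * sampled_discrepancy(src_feats, tgt_feats)
```

With this shape, step one minimizes `cls_loss` alone, and step two switches to `step2_loss`, so gradients also push the pruned model's source and target feature distributions back together.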
Pages: 57 - 63
Page count: 7
Related Papers
50 records
  • [1] UNSUPERVISED DOMAIN ADAPTATION WITH COPULA MODELS
    Tran, Cuong D.
    Rudovic, Ognjen
    Pavlovic, Vladimir
    2017 IEEE 27TH INTERNATIONAL WORKSHOP ON MACHINE LEARNING FOR SIGNAL PROCESSING, 2017,
  • [2] Unsupervised Domain Adaptation of Language Models for Reading Comprehension
    Nishida, Kosuke
    Nishida, Kyosuke
    Saito, Itsumi
    Asano, Hisako
    Tomita, Junji
    PROCEEDINGS OF THE 12TH INTERNATIONAL CONFERENCE ON LANGUAGE RESOURCES AND EVALUATION (LREC 2020), 2020, : 5392 - 5399
  • [3] Unsupervised domain adaptation based on the predictive uncertainty of models
    Lee, JoonHo
    Lee, Gyemin
    NEUROCOMPUTING, 2023, 520 : 183 - 193
  • [4] Iterative knowledge distillation and pruning for model compression in unsupervised domain adaptation
    Wang, Zhiyuan
    Shi, Long
    Mei, Zhen
    Zhao, Xiang
    Wang, Zhe
    Li, Jun
    PATTERN RECOGNITION, 2025, 164
  • [5] A Knowledge Transfer Method for Unsupervised Pose Keypoint Detection Based on Domain Adaptation and CAD Models
    Du, Fuzhou
    Kong, Feifei
    Zhao, Delong
    ADVANCED INTELLIGENT SYSTEMS, 2023, 5 (02)
  • [6] Effective Unsupervised Domain Adaptation with Adversarially Trained Language Models
    Vu, Thuy-Trang
    Phung, Dinh
    Haffari, Gholamreza
    PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020, : 6163 - 6173
  • [7] CLDA: an adversarial unsupervised domain adaptation method with classifier-level adaptation
    He, Zhihai
    Yang, Bo
    Chen, Chaoxian
    Mu, Qilin
    Li, Zesong
    MULTIMEDIA TOOLS AND APPLICATIONS, 2020, 79 (45-46) : 33973 - 33991
  • [9] Unsupervised Domain Adaptation Method Based on Discriminant Sample Selection
JOURNAL OF NORTHWESTERN POLYTECHNICAL UNIVERSITY, 2020, 38 : 828 - 837
  • [10] AN UNSUPERVISED DOMAIN ADAPTATION METHOD FOR COMPRESSED VIDEO QUALITY ENHANCEMENT
    Wang Zeyang
    2022 19TH INTERNATIONAL COMPUTER CONFERENCE ON WAVELET ACTIVE MEDIA TECHNOLOGY AND INFORMATION PROCESSING (ICCWAMTIP), 2022,