Transfer channel pruning for compressing deep domain adaptation models

Cited by: 9
Authors
Yu, Chaohui [1 ,2 ]
Wang, Jindong [3 ]
Chen, Yiqiang [1 ,2 ]
Qin, Xin [1 ,2 ]
Affiliations
[1] Chinese Acad Sci, Beijing Key Lab Mobile Comp & Pervas Device, Inst Comp Technol, Beijing, Peoples R China
[2] Univ Chinese Acad Sci, Beijing, Peoples R China
[3] Microsoft Res Asia, Beijing, Peoples R China
Keywords
Unsupervised domain adaptation; Transfer channel pruning; Accelerating
DOI
10.1007/s13042-019-01004-6
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Deep unsupervised domain adaptation (UDA) has recently received increasing attention from researchers. However, existing methods are computationally intensive because most of them are built on deep convolutional neural networks (CNNs), and there is no effective network compression method for this problem. In this paper, we propose a unified Transfer Channel Pruning (TCP) method for accelerating deep UDA models. TCP compresses the deep UDA model by pruning less important channels while simultaneously learning transferable features by reducing the cross-domain distribution divergence. It therefore reduces the impact of negative transfer and maintains competitive performance on the target task. To the best of our knowledge, TCP is the first approach that aims at accelerating deep unsupervised domain adaptation models. We validate TCP on the two main families of UDA methods, discrepancy-based and adversarial-based, and on two benchmark datasets, Office-31 and ImageCLEF-DA, with two common backbone networks, VGG16 and ResNet50. Experimental results demonstrate that TCP achieves comparable or better classification accuracy than competing methods while significantly reducing the computational cost. Specifically, with VGG16 we obtain even higher accuracy after pruning 26% of the floating point operations (FLOPs); with ResNet50 we obtain higher accuracy on half of the tasks after pruning 12% of the FLOPs, for both discrepancy-based and adversarial-based methods.
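The abstract describes TCP as pruning less important channels while fine-tuning with a cross-domain discrepancy term. Below is a minimal PyTorch-style sketch of that general idea, not the authors' actual TCP implementation: it assumes a toy network (SmallNet), uses structured L1 pruning from torch.nn.utils.prune (which zeroes channels rather than physically removing them, so real FLOPs savings would require rebuilding the layers), and uses a simple linear-kernel MMD as the distribution-divergence term.

# Minimal sketch of the transfer-channel-pruning idea: prune the least important
# conv channels, then fine-tune with a task loss plus a cross-domain discrepancy
# term (linear MMD). Network, criterion, and loss weight are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

class SmallNet(nn.Module):
    def __init__(self, num_classes: int = 31):            # 31 classes as in Office-31
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x):
        f = self.features(x).flatten(1)                    # bottleneck features
        return f, self.classifier(f)

def mmd_linear(fs: torch.Tensor, ft: torch.Tensor) -> torch.Tensor:
    """Linear-kernel MMD between source and target feature batches."""
    delta = fs.mean(0) - ft.mean(0)
    return delta.dot(delta)

def prune_channels(model: nn.Module, amount: float = 0.26) -> None:
    """Structured L1 pruning of output channels in every conv layer."""
    for m in model.modules():
        if isinstance(m, nn.Conv2d):
            prune.ln_structured(m, name="weight", amount=amount, n=1, dim=0)
            prune.remove(m, "weight")                      # keep weights permanently zeroed

if __name__ == "__main__":
    model = SmallNet()
    prune_channels(model, amount=0.26)                     # roughly the VGG16 FLOPs ratio
    opt = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)

    # One fine-tuning step on toy data: labeled source batch, unlabeled target batch.
    xs, ys = torch.randn(8, 3, 32, 32), torch.randint(0, 31, (8,))
    xt = torch.randn(8, 3, 32, 32)
    fs, logits = model(xs)
    ft, _ = model(xt)
    loss = nn.functional.cross_entropy(logits, ys) + 0.5 * mmd_linear(fs, ft)
    opt.zero_grad()
    loss.backward()
    opt.step()
    print(f"fine-tune loss: {loss.item():.4f}")

The 26% pruning ratio and the 31-class output mirror the VGG16 / Office-31 setting mentioned in the abstract; everything else (architecture, MMD weight, optimizer) is a placeholder choice for illustration.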
Pages: 3129 - 3144
Number of pages: 16
Related Papers
50 records in total
  • [31] Compressing Deep Reinforcement Learning Networks With a Dynamic Structured Pruning Method for Autonomous Driving
    Su, Wensheng
    Li, Zhenni
    Xu, Minrui
    Kang, Jiawen
    Niyato, Dusit
    Xie, Shengli
    IEEE TRANSACTIONS ON VEHICULAR TECHNOLOGY, 2024, 73 (12) : 18017 - 18030
  • [32] OPQ: Compressing Deep Neural Networks with One-shot Pruning-Quantization
    Hu, Peng
    Peng, Xi
    Zhu, Hongyuan
    Aly, Mohamed M. Sabry
    Lin, Jie
    THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 7780 - 7788
  • [33] On Compressing Deep Models by Low Rank and Sparse Decomposition
    Yu, Xiyu
    Liu, Tongliang
    Wang, Xinchao
    Tao, Dacheng
    30TH IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2017), 2017, : 67 - 76
  • [34] Compressing Deep Image Super-resolution Models
    Jiang, Yuxuan
    Nawala, Jakub
    Zhang, Fan
    Bull, David
    2024 PICTURE CODING SYMPOSIUM, PCS 2024, 2024,
  • [35] Video surveillance using deep transfer learning and deep domain adaptation: Towards better generalization
    Himeur, Yassine
    Al-Maadeed, Somaya
    Kheddar, Hamza
    Al-Maadeed, Noor
    Abualsaud, Khalid
    Mohamed, Amr
    Khattab, Tamer
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2023, 119
  • [36] Deep Visual Domain Adaptation
    Csurka, Gabriela
    2020 22ND INTERNATIONAL SYMPOSIUM ON SYMBOLIC AND NUMERIC ALGORITHMS FOR SCIENTIFIC COMPUTING (SYNASC 2020), 2020, : 1 - 8
  • [37] Deep Discriminative Domain Adaptation
    Zhang, Changchun
    Zhao, Qingjie
    INFORMATION SCIENCES, 2021, 575 : 599 - 610
  • [38] DEEP CLUSTERING FOR DOMAIN ADAPTATION
    Gao, Boyan
    Yang, Yongxin
    Gouk, Henry
    Hospedales, Timothy M.
    2020 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, 2020, : 4247 - 4251
  • [39] Fisher Deep Domain Adaptation
    Zhang, Yinghua
    Zhang, Yu
    Wei, Ying
    Bai, Kun
    Song, Yangqiu
    Yang, Qiang
    PROCEEDINGS OF THE 2020 SIAM INTERNATIONAL CONFERENCE ON DATA MINING (SDM), 2020, : 469 - 477
  • [40] A Conceptual Framework for Pruning of Deep Learning Models
    Smarts, Nyalalani
    Selvaraj, Rajalakshmi
    Kuthadi, Venumadhav
    SSRN, 2023,