Transfer channel pruning for compressing deep domain adaptation models

Cited by: 9
|
Authors
Yu, Chaohui [1 ,2 ]
Wang, Jindong [3 ]
Chen, Yiqiang [1 ,2 ]
Qin, Xin [1 ,2 ]
Affiliations
[1] Chinese Acad Sci, Beijing Key Lab Mobile Comp & Pervas Device, Inst Comp Technol, Beijing, Peoples R China
[2] Univ Chinese Acad Sci, Beijing, Peoples R China
[3] Microsoft Res Asia, Beijing, Peoples R China
Keywords
Unsupervised domain adaptation; Transfer channel pruning; Accelerating
DOI
10.1007/s13042-019-01004-6
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Deep unsupervised domain adaptation has recently received increasing attention from researchers. However, existing methods are computationally intensive because most of them are built on deep convolutional neural networks (CNNs), and no effective network compression method exists for this setting. In this paper, we propose a unified transfer channel pruning (TCP) method for accelerating deep unsupervised domain adaptation (UDA) models. TCP compresses a deep UDA model by pruning less important channels while simultaneously learning transferable features through reducing the cross-domain distribution divergence. It therefore mitigates negative transfer while maintaining competitive performance on the target task. To the best of our knowledge, TCP is the first approach that aims at accelerating deep unsupervised domain adaptation models. We validate TCP on the two main families of UDA methods, discrepancy-based and adversarial-based, using two benchmark datasets, Office-31 and ImageCLEF-DA, and two common backbone networks, VGG16 and ResNet50. Experimental results demonstrate that TCP achieves comparable or better classification accuracy than the compared methods while significantly reducing the computational cost. More specifically, with VGG16 we obtain even higher accuracy after pruning 26% of the floating-point operations (FLOPs); with ResNet50 we also obtain higher accuracy on half of the tasks after pruning 12% of the FLOPs, for both discrepancy-based and adversarial-based methods.
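As a rough, hypothetical illustration of the idea described above (not the authors' implementation), the following Python/PyTorch sketch ranks the output channels of a convolutional layer by an assumed L1-norm importance score and trains with an assumed linear-kernel MMD penalty on the source/target feature divergence. The helper names mmd_linear, channel_importance, and channels_to_prune, and the prune_ratio parameter, are all introduced here for illustration only.

# Hypothetical sketch of transfer-aware channel pruning; TCP itself may use
# a different importance criterion and transfer loss.
import torch
import torch.nn as nn

def mmd_linear(f_src: torch.Tensor, f_tgt: torch.Tensor) -> torch.Tensor:
    # Linear-kernel MMD between source and target feature batches.
    delta = f_src.mean(dim=0) - f_tgt.mean(dim=0)
    return delta.dot(delta)

def channel_importance(conv: nn.Conv2d) -> torch.Tensor:
    # Score each output channel by the L1 norm of its filter weights.
    return conv.weight.detach().abs().sum(dim=(1, 2, 3))

def channels_to_prune(conv: nn.Conv2d, prune_ratio: float) -> torch.Tensor:
    # Indices of the least important channels under the L1 criterion.
    scores = channel_importance(conv)
    k = int(prune_ratio * scores.numel())
    return torch.argsort(scores)[:k]

# Toy usage: one conv layer, a task loss on labeled source data, and an
# MMD transfer term on unlabeled target data.
conv = nn.Conv2d(3, 16, kernel_size=3, padding=1)
x_src, x_tgt = torch.randn(8, 3, 32, 32), torch.randn(8, 3, 32, 32)
f_src = conv(x_src).mean(dim=(2, 3))  # pooled source features, shape (8, 16)
f_tgt = conv(x_tgt).mean(dim=(2, 3))  # pooled target features, shape (8, 16)
labels = torch.randint(0, 16, (8,))   # fake source labels
loss = nn.functional.cross_entropy(f_src, labels) + 0.5 * mmd_linear(f_src, f_tgt)
loss.backward()
print(channels_to_prune(conv, prune_ratio=0.25))  # 4 least important channels

After such a training step, the lowest-scoring channels would be removed and the slimmed model fine-tuned again, which is the compress-while-adapting loop the abstract describes.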
Pages: 3129-3144
Number of pages: 16
Related Papers
50 records in total
  • [41] Inter-Vendor Compatibility and Transfer Learning for MR-Based Synthetic CT Deep Learning Models for Domain Adaptation
    Klages, P.
    Tyagi, N.
    Veeraraghavan, H.
    MEDICAL PHYSICS, 2020, 47 (06) : E429 - E429
  • [42] Differential Evolution Based Layer-Wise Weight Pruning for Compressing Deep Neural Networks
    Wu, Tao
    Li, Xiaoyang
    Zhou, Deyun
    Li, Na
    Shi, Jiao
    SENSORS, 2021, 21 (03) : 1 - 20
  • [44] Linearly Replaceable Filters for Deep Network Channel Pruning
    Joo, Donggyu
    Yi, Eojindl
    Baek, Sunghyun
    Kim, Junmo
    THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 8021 - 8029
  • [45] Conditional Automated Channel Pruning for Deep Neural Networks
    Liu, Yixin
    Guo, Yong
    Guo, Jiaxin
    Jiang, Luoqian
    Chen, Jian
    IEEE SIGNAL PROCESSING LETTERS, 2021, 28 : 1275 - 1279
  • [46] Channel Pruning for Accelerating Very Deep Neural Networks
    He, Yihui
    Zhang, Xiangyu
    Sun, Jian
    2017 IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV), 2017, : 1398 - 1406
  • [47] A new deep transfer learning method for intelligent bridge damage diagnosis based on multi-channel sub-domain adaptation
    Xiao, Haitao
    Ogai, Harutoshi
    Wang, Wenjie
    STRUCTURE AND INFRASTRUCTURE ENGINEERING, 2024, 20 (12) : 1994 - 2009
  • [48] Spectral Pruning: Compressing Deep Neural Networks via Spectral Analysis and its Generalization Error
    Suzuki, Taiji
    Abe, Hiroshi
    Murata, Tomoya
    Horiuchi, Shingo
    Ito, Kotaro
    Wachi, Tokuma
    Hirai, So
    Yukishima, Masatoshi
    Nishimura, Tomoaki
    PROCEEDINGS OF THE TWENTY-NINTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2020, : 2839 - 2846
  • [49] Compressing Deep Models using Multi Tensor Train Decomposition
    Yang, Xin
    Sun, Weize
    Huang, Lei
    Chen, Shaowu
    ICCAIS 2019: THE 8TH INTERNATIONAL CONFERENCE ON CONTROL, AUTOMATION AND INFORMATION SCIENCES, 2019,
  • [50] Incomplete Multi-view Domain Adaptation via Channel Enhancement and Knowledge Transfer
    Xia, Haifeng
    Wang, Pu
    Ding, Zhengming
    COMPUTER VISION, ECCV 2022, PT XXXIV, 2022, 13694 : 200 - 217