Representation Learning and Knowledge Distillation for Lightweight Domain Adaptation

Cited by: 0
Authors
Bin Shah, Sayed Rafay [1 ]
Putty, Shreyas Subhash [1 ]
Schwung, Andreas [1 ]
Affiliations
[1] South Westphalia Univ Appl Sci, Dept Elect Power Engn, Soest, Germany
Keywords
Unsupervised domain adaptation; maximum mean discrepancy; knowledge distillation; representation learning; remaining useful lifetime estimation; C-MAPSS
DOI
10.1109/CAI59869.2024.00214
CLC Classification Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
In industrial machine learning applications, insufficient data, missing labels, distribution shift between subsets, and varying operational conditions cause pre-trained neural network models to generalize poorly across domains. In contrast to image detection tasks, time-series datasets contain critical domain-specific characteristics that the corresponding networks must learn. Naively aligning the learned representations during adaptation increases the risk of losing this key information and thus degrades performance. This paper proposes a lightweight domain adaptation method combining representation learning and knowledge distillation (RepLKD). A separate network is pre-trained to capture valuable information from the target data in its latent space with the help of a reconstructor. In the adaptation stage, we use maximum mean discrepancy to minimize the difference between the source and target latent distributions. Additionally, we apply knowledge distillation to encourage the target network to generate source-like latent embeddings, penalizing it only when an upper-bound condition is violated, in order to prevent over-regularization and the loss of domain-specific features. Finally, we evaluate the proposed method on 12 cross-domain scenarios from the C-MAPSS dataset and compare its efficacy against existing methods from the literature.
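The abstract describes two loss terms: a maximum mean discrepancy between source and target latent embeddings, and a distillation penalty that is applied only when an upper-bound condition is violated. Below is a minimal PyTorch sketch of how such terms might look; the RBF kernel bandwidth, the margin upper_bound, the tensor shapes, and the unweighted sum of the losses are illustrative assumptions, not values or details taken from the paper.

    # Sketch of the two adaptation losses sketched in the abstract:
    # (1) an RBF-kernel MMD between source and target latent embeddings,
    # (2) a distillation penalty that fires only when the embedding
    #     distance exceeds an assumed upper bound, so domain-specific
    #     features are not over-regularized away.
    import torch

    def rbf_kernel(x: torch.Tensor, y: torch.Tensor, sigma: float) -> torch.Tensor:
        """Pairwise RBF kernel k(x, y) = exp(-||x - y||^2 / (2 sigma^2))."""
        dist_sq = torch.cdist(x, y, p=2).pow(2)
        return torch.exp(-dist_sq / (2.0 * sigma ** 2))

    def mmd_loss(source_z: torch.Tensor, target_z: torch.Tensor,
                 sigma: float = 1.0) -> torch.Tensor:
        """Biased estimate of squared MMD between two embedding batches."""
        k_ss = rbf_kernel(source_z, source_z, sigma).mean()
        k_tt = rbf_kernel(target_z, target_z, sigma).mean()
        k_st = rbf_kernel(source_z, target_z, sigma).mean()
        return k_ss + k_tt - 2.0 * k_st

    def bounded_distillation_loss(student_z: torch.Tensor,
                                  teacher_z: torch.Tensor,
                                  upper_bound: float = 0.1) -> torch.Tensor:
        """Penalize the student embedding only when its mean squared
        distance to the (frozen) teacher embedding exceeds upper_bound."""
        drift = (student_z - teacher_z.detach()).pow(2).mean(dim=1)
        return torch.clamp(drift - upper_bound, min=0.0).mean()

    # Illustrative use in a training step (shapes: [batch, latent_dim]):
    source_z = torch.randn(32, 64)
    target_z = torch.randn(32, 64)
    teacher_z = torch.randn(32, 64)
    loss = mmd_loss(source_z, target_z) + bounded_distillation_loss(target_z, teacher_z)

The clamp in bounded_distillation_loss implements the "penalize only when the upper-bound condition is not fulfilled" behavior: inside the margin the gradient is zero, so the target network keeps its domain-specific structure while still being pulled toward source-like embeddings when it drifts too far.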
Pages: 1202-1207
Page count: 6