Representation Learning and Knowledge Distillation for Lightweight Domain Adaptation

Citations: 0
Authors
Bin Shah, Sayed Rafay [1]
Putty, Shreyas Subhash [1]
Schwung, Andreas [1]
Affiliations
[1] South Westphalia University of Applied Sciences, Department of Electrical Power Engineering, Soest, Germany
Keywords
Unsupervised domain adaptation; maximum mean discrepancy; knowledge distillation; representation learning; remaining useful lifetime estimation; C-MAPSS
DOI
10.1109/CAI59869.2024.00214
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
In industrial machine learning applications, insufficient data, missing labels, distribution shift between subsets, and varying operational conditions lead to poor generalization of pre-trained neural network models across domains. In contrast to image detection tasks, time series datasets contain critical domain-specific characteristics that the corresponding networks must learn. Naively aligning the learned representations during adaptation risks losing this key information and thus degrades performance. This paper proposes a lightweight domain adaptation method combining representation learning and knowledge distillation (RepLKD). A separate network is pre-trained, with the help of a reconstructor, to capture valuable information from the target data in its latent space. In the adaptation stage, we use maximum mean discrepancy to minimize the difference between the source and target latent distributions. Additionally, we apply knowledge distillation to encourage the target network to generate source-like latent embeddings, penalizing the mismatch only when an upper-bound condition is not met, which prevents over-regularization and the loss of domain-specific features. Finally, we evaluate the proposed method on 12 cross-domain scenarios of the C-MAPSS dataset and compare its efficacy against existing methods from the literature.
Pages: 1202-1207 (6 pages)
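
The adaptation stage described in the abstract combines two latent-space losses: an MMD term that aligns the source and target latent distributions, and a distillation term that is active only when the latent mismatch exceeds an upper bound. Below is a minimal PyTorch sketch of that combination; the RBF bandwidth, the margin value, the loss weights, the use of MSE as the distillation divergence, and the encoder interfaces are illustrative assumptions, not details confirmed by the abstract.

```python
# Sketch of the two adaptation losses suggested by the abstract.
# All hyperparameters and the encoder API are assumptions for illustration.
import torch
import torch.nn.functional as F

def rbf_mmd(x, y, sigma=1.0):
    """Biased estimate of squared MMD between two latent batches (RBF kernel)."""
    def k(a, b):
        d2 = torch.cdist(a, b).pow(2)            # pairwise squared distances
        return torch.exp(-d2 / (2 * sigma ** 2)) # Gaussian kernel values
    return k(x, x).mean() + k(y, y).mean() - 2 * k(x, y).mean()

def bounded_distillation(z_student, z_teacher, margin=0.1):
    """Distillation penalty that is active only above an upper bound (margin),
    so domain-specific target features are not over-regularized away."""
    gap = F.mse_loss(z_student, z_teacher)
    return torch.clamp(gap - margin, min=0.0)

def adaptation_loss(source_encoder, target_encoder, x_src, x_tgt,
                    w_mmd=1.0, w_kd=0.5):
    """One adaptation-stage loss: MMD aligns the two latent distributions,
    while the bounded distillation term nudges target embeddings toward
    source-like ones without forcing an exact match."""
    z_src = source_encoder(x_src).detach()  # source (teacher) latents, frozen
    z_tgt = target_encoder(x_tgt)           # target (student) latents
    loss = w_mmd * rbf_mmd(z_src, z_tgt)
    # Teacher's view of the same target batch serves as the distillation target.
    loss = loss + w_kd * bounded_distillation(z_tgt, source_encoder(x_tgt).detach())
    return loss
```

Here the hinge-style clamp realizes the "penalize only when an upper-bound condition is not met" idea: the target encoder is free to retain domain-specific structure as long as its embeddings stay within the margin of the teacher's.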