Representation Learning and Knowledge Distillation for Lightweight Domain Adaptation

Cited by: 0
Authors
Bin Shah, Sayed Rafay [1 ]
Putty, Shreyas Subhash [1 ]
Schwung, Andreas [1 ]
Affiliations
[1] South Westphalia Univ Appl Sci, Dept Elect Power Engn, Soest, Germany
Keywords
Unsupervised domain adaptation; maximum mean discrepancy; knowledge distillation; representation learning; remaining useful lifetime estimation; C-MAPSS
DOI
10.1109/CAI59869.2024.00214
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
In industrial machine learning applications, insufficient data, lack of labeling, distribution shift between subsets, varying operational conditions, etc. result in poor generalization performance by pre-trained neural network models across domains. In contrast to image detection tasks, time series datasets contain critical domain-specific characteristics that must be learned by the corresponding networks. Naively aligning the learned representations during the adaptation process increases the risk of losing this key information, resulting in poor performance. This paper proposes a lightweight domain adaptation method involving representation learning and knowledge distillation (RepLKD). A separate network is pre-trained to learn valuable information from the target data in its latent space with the help of a reconstructor. In the adaptation stage, we use maximum mean discrepancy to minimize the difference in distributions between the source and target latent spaces. Additionally, we implement knowledge distillation to encourage the target network to generate source-like latent embeddings, penalizing only when an upper-bound condition is not fulfilled, to prevent over-regularization and loss of domain-specific features. Finally, we test our proposed method on 12 cross-domain scenarios with the C-MAPSS dataset and compare the efficacy of our method against existing literature methods.
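The two adaptation losses described in the abstract can be illustrated with a minimal NumPy sketch. This is an assumption-laden illustration, not the paper's implementation: it uses an RBF kernel with a fixed bandwidth `gamma` for the MMD term, and models the "upper-bound condition" as a simple hinge with a hypothetical threshold `tau`, since the abstract does not specify the exact kernel or bound.

```python
import numpy as np

def rbf_kernel(a, b, gamma=0.1):
    # Pairwise RBF kernel matrix between rows of a and b.
    # gamma is a hypothetical bandwidth; in practice a median
    # heuristic or a mixture of bandwidths is common.
    sq = np.sum(a**2, 1)[:, None] + np.sum(b**2, 1)[None, :] - 2.0 * a @ b.T
    return np.exp(-gamma * sq)

def mmd2(x, y, gamma=0.1):
    # Biased estimate of squared maximum mean discrepancy
    # between the empirical distributions of x and y.
    return (rbf_kernel(x, x, gamma).mean()
            + rbf_kernel(y, y, gamma).mean()
            - 2.0 * rbf_kernel(x, y, gamma).mean())

def bounded_distill_loss(student_z, teacher_z, tau=0.1):
    # Hinged distillation penalty: zero while the mean squared
    # embedding gap stays under the (assumed) upper bound tau,
    # so domain-specific structure below the bound is not regularized away.
    gap = np.mean((student_z - teacher_z) ** 2)
    return max(0.0, gap - tau)

rng = np.random.default_rng(0)
src = rng.normal(0.0, 1.0, (64, 8))  # stand-in source latent embeddings
tgt = rng.normal(0.5, 1.0, (64, 8))  # shifted stand-in target embeddings
print("MMD^2(src, tgt):", mmd2(src, tgt))
print("bounded KD loss:", bounded_distill_loss(src, tgt, tau=0.1))
```

A shifted target distribution yields a clearly positive MMD, while the hinge keeps the distillation penalty at zero until the embedding gap exceeds the bound.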
Pages: 1202-1207 (6 pages)
Related papers
50 items
  • [41] Joint predictive model and representation learning for visual domain adaptation
    Gheisari, Marzieh
    Baghshah, Mahdieh Soleymani
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2017, 58 : 157 - 170
  • [42] Unsupervised Domain Adaptation in the Wild via Disentangling Representation Learning
    Li, Haoliang
    Wan, Renjie
    Wang, Shiqi
    Kot, Alex C.
    INTERNATIONAL JOURNAL OF COMPUTER VISION, 2021, 129 (02) : 267 - 283
  • [43] Domain Adaptation with Invariant Representation Learning: What Transformations to Learn?
    Stojanov, Petar
    Li, Zijian
    Gong, Mingming
    Cai, Ruichu
    Carbonell, Jaime G.
    Zhang, Kun
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [44] A lightweight hierarchical graph convolutional model for knowledge graph representation learning
    Zhang, Jinglin
    Shen, Bo
    APPLIED INTELLIGENCE, 2024, 54 (21) : 10695 - 10708
  • [45] Domain Adaptation with Representation Learning and Nonlinear Relation for Time Series
    Hussein, Amir
    Hajj, Hazem
    ACM TRANSACTIONS ON INTERNET OF THINGS, 2022, 3 (02):
  • [47] Domain Adaptation Meets Disentangled Representation Learning and Style Transfer
    Vu-Hoang Tran
    Huang, Ching-Chun
    2019 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN AND CYBERNETICS (SMC), 2019, : 2998 - 3005
  • [48] Representation learning via serial robust autoencoder for domain adaptation
    Yang, Shuai
    Zhang, Yuhong
    Wang, Hao
    Li, Peipei
    Hu, Xuegang
    EXPERT SYSTEMS WITH APPLICATIONS, 2020, 160
  • [49] Joint metric and feature representation learning for unsupervised domain adaptation
    Xie, Yue
    Du, Zhekai
    Li, Jingjing
    Jing, Mengmeng
    Chen, Erpeng
    Lu, Ke
    KNOWLEDGE-BASED SYSTEMS, 2020, 192
  • [50] Domain Adaptation for Graph Representation Learning: Challenges, Progress, and Prospects
    Shi, Bo-Shen
    Wang, Yong-Qing
    Guo, Fang-Da
    Xu, Bing-Bing
    Shen, Hua-Wei
    Cheng, Xue-Qi
    JOURNAL OF COMPUTER SCIENCE AND TECHNOLOGY, 2025