Domain Knowledge Distillation and Supervised Contrastive Learning for Industrial Process Monitoring

Cited by: 11
Authors
Ai, Mingxi [1 ,2 ]
Xie, Yongfang [1 ]
Ding, Steven X. X. [2 ]
Tang, Zhaohui [1 ]
Gui, Weihua [1 ]
Affiliations
[1] Cent South Univ, Sch Automat, Changsha 410083, Peoples R China
[2] Univ Duisburg Essen, Inst Automat Control & Complex Syst, D-47057 Duisburg, Germany
Keywords
Feature extraction; Process monitoring; Deep learning; Knowledge engineering; Convolutional neural networks; Task analysis; Reliability; Hard negative; industrial process monitoring; knowledge distillation; memory queue-based negative sample augmentation; supervised contrastive learning; HANDCRAFTED FEATURES; IMAGE; FLOTATION
DOI
10.1109/TIE.2022.3206696
Chinese Library Classification
TP [Automation technology, computer technology]
Discipline code
0812
Abstract
To ensure the reliability and safety of modern industrial process monitoring, computer vision-based soft measurement has received considerable attention due to its nonintrusive nature. State-of-the-art computer vision-based approaches mostly rely on feature embeddings from deep neural networks. However, this kind of feature extraction suffers from noise and the scarcity of labeled training instances, leading to unsatisfactory performance in real industrial process monitoring. In this article, we develop a novel hybrid learning framework for feature representation based on knowledge distillation and supervised contrastive learning. First, we transfer the abundant semantic information in handcrafted features to a deep feature extraction network via knowledge distillation. Then, to enhance feature discrimination, supervised contrastive learning is employed to contrast many positive pairs against many negative pairs per anchor. Meanwhile, two important mechanisms, memory queue-based negative sample augmentation and hard negative sampling, are incorporated into the supervised contrastive learning model to guide the selection of negative samples. Finally, a flotation process monitoring problem is considered to demonstrate the effectiveness of the proposed method.
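The supervised contrastive objective with a memory queue and hard-negative selection outlined in the abstract can be sketched as follows. This is a minimal NumPy illustration of the generic supervised contrastive loss (Khosla et al., 2020), not the authors' exact implementation; the `top_k_hard` parameter is a simplified, hypothetical stand-in for their hard negative sampling mechanism, and the optional `queue_feats` argument mimics memory queue-based negative sample augmentation.

```python
import numpy as np

def sup_con_loss(feats, labels, queue_feats=None, queue_labels=None,
                 temperature=0.1, top_k_hard=None):
    """Supervised contrastive loss over L2-normalized embeddings.

    feats:  (N, D) anchor embeddings (rows assumed L2-normalized)
    labels: (N,)   integer class labels
    queue_feats / queue_labels: optional past embeddings appended to the
        contrast set (memory queue-style negative sample augmentation)
    top_k_hard: if set, keep only the k highest-similarity negatives per
        anchor (a simple proxy for hard negative sampling)
    """
    if queue_feats is None:
        contrast, c_labels = feats, labels
    else:
        contrast = np.vstack([feats, queue_feats])
        c_labels = np.concatenate([labels, queue_labels])

    sim = feats @ contrast.T / temperature          # (N, M) scaled similarities
    n = feats.shape[0]

    # Mask out each anchor's comparison with itself (first n contrast columns
    # are the anchors themselves).
    self_mask = np.zeros(sim.shape, dtype=bool)
    self_mask[:, :n] = np.eye(n, dtype=bool)

    pos_mask = (labels[:, None] == c_labels[None, :]) & ~self_mask
    neg_mask = labels[:, None] != c_labels[None, :]

    if top_k_hard is not None:
        # Keep only the top-k most similar (hardest) negatives per anchor.
        hard = np.zeros_like(neg_mask)
        for i in range(n):
            neg_idx = np.where(neg_mask[i])[0]
            keep = neg_idx[np.argsort(sim[i, neg_idx])[-top_k_hard:]]
            hard[i, keep] = True
        neg_mask = hard

    losses = []
    for i in range(n):
        denom = np.exp(sim[i, pos_mask[i] | neg_mask[i]]).sum()
        pos_logits = sim[i, pos_mask[i]]
        if pos_logits.size:
            # -log( exp(s_ip) / denom ), averaged over positives p of anchor i
            losses.append(np.mean(np.log(denom) - pos_logits))
    return float(np.mean(losses))
```

With well-separated class clusters the loss approaches zero, while mixed clusters yield a large loss; enlarging the contrast set through the queue gives each anchor more negatives without enlarging the minibatch, which is the motivation for the queue in the paper.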
Pages: 9452-9462
Page count: 11
Related Papers
50 records in total
  • [11] Knowledge Distillation for Semi-supervised Domain Adaptation
    Orbes-Arteaga, Mauricio
    Cardoso, Jorge
    Sorensen, Lauge
    Igel, Christian
    Ourselin, Sebastien
    Modat, Marc
    Nielsen, Mads
    Pai, Akshay
    OR 2.0 CONTEXT-AWARE OPERATING THEATERS AND MACHINE LEARNING IN CLINICAL NEUROIMAGING, 2019, 11796 : 68 - 76
  • [12] MULTI-VIEW CONTRASTIVE LEARNING FOR ONLINE KNOWLEDGE DISTILLATION
    Yang, Chuanguang
    An, Zhulin
    Xu, Yongjun
    2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 3750 - 3754
  • [13] Cross-domain knowledge transfer in industrial process monitoring: A survey
    Chai, Zheng
    Zhao, Chunhui
    Huang, Biao
    JOURNAL OF PROCESS CONTROL, 2025, 149
  • [14] CLDA: Contrastive Learning for Semi-Supervised Domain Adaptation
    Singh, Ankit
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [15] Supervised Contrastive Learning
    Khosla, Prannay
    Teterwak, Piotr
    Wang, Chen
    Sarna, Aaron
    Tian, Yonglong
    Isola, Phillip
    Maschinot, Aaron
    Liu, Ce
    Krishnan, Dilip
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [16] SSCL: Semi-supervised Contrastive Learning for Industrial Anomaly Detection
    Cai, Wei
    Gao, Jiechao
    PATTERN RECOGNITION AND COMPUTER VISION, PRCV 2023, PT IV, 2024, 14428 : 100 - 112
  • [17] Leveraging Contrastive Learning and Knowledge Distillation for Incomplete Modality Rumor Detection
    Xu, Fan
    Fan, Pinyun
    Huang, Qi
    Zou, Bowei
    Aw, AiTi
    Wang, Mingwen
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (EMNLP 2023), 2023, : 13492 - 13503
  • [18] Balanced Knowledge Distillation with Contrastive Learning for Document Re-ranking
    Yang, Yingrui
    He, Shanxiu
    Qiao, Yifan
    Xie, Wentai
    Yang, Tao
    PROCEEDINGS OF THE 2023 ACM SIGIR INTERNATIONAL CONFERENCE ON THE THEORY OF INFORMATION RETRIEVAL, ICTIR 2023, 2023, : 247 - 255
  • [19] Incorporating Domain Knowledge Graph into Multimodal Movie Genre Classification with Self-Supervised Attention and Contrastive Learning
    Li, Jiaqi
    Qi, Guilin
    Zhang, Chuanyi
    Chen, Yongrui
    Tan, Yiming
    Xia, Chenlong
    Tian, Ye
    PROCEEDINGS OF THE 31ST ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, MM 2023, 2023, : 3337 - 3345
  • [20] Improving Structural and Semantic Global Knowledge in Graph Contrastive Learning with Distillation
    Wen, Mi
    Wang, Hongwei
    Xue, Yunsheng
    Wu, Yi
    Wen, Hong
    ADVANCES IN KNOWLEDGE DISCOVERY AND DATA MINING, PT II, PAKDD 2024, 2024, 14646 : 364 - 375