Domain Knowledge Distillation and Supervised Contrastive Learning for Industrial Process Monitoring

Cited by: 11
Authors
Ai, Mingxi [1 ,2 ]
Xie, Yongfang [1 ]
Ding, Steven X. [2]
Tang, Zhaohui [1 ]
Gui, Weihua [1 ]
Affiliations
[1] Central South University, School of Automation, Changsha 410083, People's Republic of China
[2] University of Duisburg-Essen, Institute for Automatic Control and Complex Systems, D-47057 Duisburg, Germany
Keywords
Feature extraction; Process monitoring; Deep learning; Knowledge engineering; Convolutional neural networks; Task analysis; Reliability; Hard negative; Industrial process monitoring; Knowledge distillation; Memory queue-based negative sample augmentation; Supervised contrastive learning; Handcrafted features; Image; Flotation
DOI
10.1109/TIE.2022.3206696
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Subject Classification
0812
Abstract
To ensure the reliability and safety of modern industrial processes, computer vision-based soft measurement for process monitoring has received considerable attention due to its nonintrusive nature. State-of-the-art computer vision-based approaches mostly rely on feature embeddings from deep neural networks. However, this kind of feature extraction suffers from noise effects and from the scarcity of labeled training instances, leading to unsatisfactory performance in real industrial process monitoring. In this article, we develop a novel hybrid learning framework for feature representation based on knowledge distillation and supervised contrastive learning. First, we transfer the abundant semantic information contained in handcrafted features to a deep feature extraction network via knowledge distillation. Then, to enhance feature discrimination, supervised contrastive learning is employed to contrast many positive pairs against many negative pairs per anchor. Meanwhile, two important mechanisms, memory queue-based negative sample augmentation and hard negative sampling, are incorporated into the supervised contrastive learning model to guide the proper selection of negative samples. Finally, a flotation process monitoring problem is used to demonstrate the effectiveness of the proposed method.
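The two training signals described in the abstract can be made concrete with a short sketch. What follows is a minimal, illustrative sketch assuming PyTorch, not the authors' implementation: the names distill_loss and sup_con_loss, the MSE form of the distillation term, and the temperature, queue size, and hard-negative count are all assumptions. It shows a supervised contrastive loss whose negatives are drawn from a memory queue of past embeddings and filtered by hard negative sampling (the top-k most similar differently-labeled entries), alongside a distillation term that pulls deep features toward handcrafted descriptors.

```python
import torch
import torch.nn.functional as F

def distill_loss(student_feats, handcrafted_feats, proj):
    """Knowledge distillation (assumed MSE form): regress the deep features
    onto the handcrafted-feature 'teacher' through a learned projection."""
    return F.mse_loss(proj(student_feats), handcrafted_feats)

def sup_con_loss(anchors, labels, queue_feats, queue_labels,
                 temperature=0.07, num_hard_negatives=256):
    """Supervised contrastive loss with a memory queue of past embeddings.
    Positives: queue entries sharing the anchor's label. Negatives: the
    hardest (most similar) differently-labeled queue entries."""
    anchors = F.normalize(anchors, dim=1)            # (B, D)
    queue_feats = F.normalize(queue_feats, dim=1)    # (Q, D)
    sim = anchors @ queue_feats.t() / temperature    # (B, Q)

    pos_mask = labels.unsqueeze(1) == queue_labels.unsqueeze(0)  # (B, Q)

    # Hard negative sampling: keep only the k most similar negatives.
    neg_sim = sim.masked_fill(pos_mask, float('-inf'))
    k = min(num_hard_negatives, queue_feats.size(0))
    hard_neg_sim, _ = neg_sim.topk(k, dim=1)

    # Per-anchor SupCon: average log-probability of each positive against
    # a denominator of all positives plus the selected hard negatives.
    pos_sim = sim.masked_fill(~pos_mask, float('-inf'))
    denom = torch.logsumexp(torch.cat([pos_sim, hard_neg_sim], dim=1), dim=1)
    log_prob = pos_sim - denom.unsqueeze(1)
    per_anchor = (log_prob.masked_fill(~pos_mask, 0.0).sum(dim=1)
                  / pos_mask.sum(dim=1).clamp(min=1))
    return -per_anchor.mean()

# Usage sketch with hypothetical shapes: 128-dim deep embeddings, a queue of
# 4096 past embeddings, 32-dim handcrafted descriptors, assumed weight 0.5.
z, y = torch.randn(16, 128), torch.randint(0, 4, (16,))
qz, qy = torch.randn(4096, 128), torch.randint(0, 4, (4096,))
h, proj = torch.randn(16, 32), torch.nn.Linear(128, 32)
loss = sup_con_loss(z, y, qz, qy) + 0.5 * distill_loss(z, h, proj)
```

Restricting the denominator to the hardest negatives is one plausible reading of the abstract's "hard negative sampling" mechanism; the weighting between the contrastive and distillation terms is likewise an assumed hyperparameter.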
Pages: 9452-9462 (11 pages)
Related Papers
50 items in total
  • [21] A Medical Image Segmentation Method Combining Knowledge Distillation and Contrastive Learning
    Ma, Xiaoxuan
    Shan, Sihan
    Sui, Dong
    Journal of Computers (Taiwan), 2024, 35(3): 363-377
  • [22] Online Knowledge Distillation via Mutual Contrastive Learning for Visual Recognition
    Yang, Chuanguang
    An, Zhulin
    Zhou, Helong
    Zhuang, Fuzhen
    Xu, Yongjun
    Zhang, Qian
    IEEE Transactions on Pattern Analysis and Machine Intelligence, 2023, 45(8): 10212-10227
  • [23] FedRCIL: Federated Knowledge Distillation for Representation based Contrastive Incremental Learning
    Psaltis, Athanasios
    Chatzikonstantinou, Christos
    Patrikakis, Charalampos Z.
    Daras, Petros
    2023 IEEE/CVF International Conference on Computer Vision Workshops (ICCVW), 2023: 3455-3464
  • [24] Semi-supervised contrastive learning for flotation process monitoring with uncertainty-aware prototype optimization
    Ai, Mingxi
    Zhang, Jin
    Li, Peng
    Wu, Jiande
    Tang, Zhaohui
    Xie, Yongfang
    Engineering Applications of Artificial Intelligence, 2025, 145
  • [25] Self-supervised knowledge distillation in counterfactual learning for VQA
    Bi, Yandong
    Jiang, Huajie
    Zhang, Hanfu
    Hu, Yongli
    Yin, Baocai
    Pattern Recognition Letters, 2024, 177: 33-39
  • [26] Self-supervised knowledge distillation for complementary label learning
    Liu, Jiabin
    Li, Biao
    Lei, Minglong
    Shi, Yong
    Neural Networks, 2022, 155: 318-327
  • [27] Continual Learning with Semi-supervised Contrastive Distillation for Incremental Neural Machine Translation
    Liang, Yunlong
    Meng, Fandong
    Wang, Jiaan
    Xu, Jinan
    Chen, Yufeng
    Zhou, Jie
    Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics, Vol. 1: Long Papers, 2024: 10914-10928
  • [28] Toward Understanding the Feature Learning Process of Self-supervised Contrastive Learning
    Wen, Zixin
    Li, Yuanzhi
    International Conference on Machine Learning, Vol. 139, 2021
  • [29] Stable Knowledge Transfer for Contrastive Distillation
    Tang, Qiankun
    2024 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2024: 4995-4999
  • [30] Supervised contrastive learning for recommendation
    Yang, Chun
    Zou, Jianxiao
    Wu, JianHua
    Xu, Hongbing
    Fan, Shicai
    Knowledge-Based Systems, 2022, 258