Domain Knowledge Distillation and Supervised Contrastive Learning for Industrial Process Monitoring

Cited: 11
Authors
Ai, Mingxi [1 ,2 ]
Xie, Yongfang [1 ]
Ding, Steven X. X. [2 ]
Tang, Zhaohui [1 ]
Gui, Weihua [1 ]
Affiliations
[1] Cent South Univ, Sch Automat, Changsha 410083, Peoples R China
[2] Univ Duisburg Essen, Inst Automat Control & Complex Syst, D-47057 Duisburg, Germany
Keywords
Feature extraction; Process monitoring; Deep learning; Knowledge engineering; Convolutional neural networks; Task analysis; Reliability; Hard negative; industrial process monitoring; knowledge distillation; memory queue-based negative sample augmentation; supervised contrastive learning; HANDCRAFTED FEATURES; IMAGE; FLOTATION;
DOI
10.1109/TIE.2022.3206696
CLC Classification Number
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
To ensure the reliability and safety of modern industrial process monitoring, computer-vision-based soft measurement has received considerable attention due to its nonintrusive nature. State-of-the-art computer-vision approaches mostly rely on feature embeddings from deep neural networks. However, this kind of feature extraction suffers from noise effects and from the limited number of labeled training instances, leading to unsatisfactory performance in real industrial process monitoring. In this article, we develop a novel hybrid learning framework for feature representation based on knowledge distillation and supervised contrastive learning. First, we transfer the abundant semantic information contained in handcrafted features to a deep-feature network via knowledge distillation. Then, to enhance feature discrimination, supervised contrastive learning is employed to contrast many positive pairs against many negative pairs per anchor. Meanwhile, two important mechanisms, memory queue-based negative sample augmentation and hard negative sampling, are added to the supervised contrastive learning model to assist the proper selection of negative samples. Finally, a flotation process monitoring problem is considered to illustrate and demonstrate the effectiveness of the proposed method.
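To make the abstract's core mechanism concrete, the following is a minimal sketch of a supervised contrastive loss in which positives and negatives for each anchor are drawn from a memory queue of past embeddings. This is a generic SupCon-with-queue formulation, not the paper's exact loss; the function name, temperature value, and tensor shapes are illustrative assumptions.

```python
import numpy as np

def supcon_loss_with_queue(anchors, labels, queue_feats, queue_labels,
                           temperature=0.1):
    """Supervised contrastive loss against a memory queue (illustrative sketch).

    anchors:      (B, D) embeddings of the current batch
    labels:       (B,)   class labels of the anchors
    queue_feats:  (Q, D) embeddings stored in the memory queue
    queue_labels: (Q,)   class labels of the queued embeddings
    """
    # L2-normalize so dot products are cosine similarities
    anchors = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    queue_feats = queue_feats / np.linalg.norm(queue_feats, axis=1, keepdims=True)

    # Similarity of each anchor to every queued embedding, scaled by temperature
    logits = anchors @ queue_feats.T / temperature            # (B, Q)

    # Numerically stable log-softmax over the queue dimension
    logits = logits - logits.max(axis=1, keepdims=True)
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))

    # Positives: queued samples sharing the anchor's class label;
    # everything else in the queue acts as a negative
    pos_mask = (labels[:, None] == queue_labels[None, :]).astype(float)
    pos_count = np.maximum(pos_mask.sum(axis=1), 1.0)

    # Average negative log-likelihood over each anchor's positives
    loss = -(pos_mask * log_prob).sum(axis=1) / pos_count
    return loss.mean()
```

The queue enlarges the pool of negatives well beyond the batch size, which is the purpose of the memory queue-based negative sample augmentation described in the abstract; hard negative sampling would additionally reweight or select the queued negatives most similar to the anchor.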
Pages: 9452 - 9462
Number of Pages: 11