Collaborative Multi-Teacher Distillation for Multi-Task Fault Detection in Power Distribution Grid

Cited by: 0
Authors
Huang, Bingzheng [1 ]
Ni, Chengxin [1 ]
Song, Junjie [1 ]
Yin, Yifan [1 ]
Chen, Ningjiang [1 ]
Affiliations
[1] Guangxi Univ, Sch Comp & Elect Informat, Nanning, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
knowledge distillation; multi-task collaboration; deep learning; fault detection;
DOI
10.1109/CSCWD61410.2024.10580632
CLC Number
TP39 [Computer Applications]
Subject Classification Codes
081203; 0835
Abstract
Given the complex fault detection scenarios and diverse data in today's power distribution grid, deploying high-performance fault detection models for efficient multi-task collaborative processing on lightweight edge intelligent devices is a significant challenge. We introduce a collaborative multi-teacher distillation method for multi-task fault detection in the power distribution grid (CMT-KD), which enhances multi-task fault detection by redefining the Residual Network (ResNet). We further compress and accelerate the model through weight quantization, and improve the performance of the multi-task fault detector by dynamically adapting the knowledge weights assigned to multiple teachers. The proposed method detects multiple power distribution grid faults with a single model, substantially reducing the number of model parameters, floating-point operations, and training time, while improving accuracy and reducing prediction errors.
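To illustrate the kind of dynamic teacher weighting described in the abstract, the minimal PyTorch sketch below blends soft targets from several teacher models, weighting each teacher by how well it fits the current batch. It is a sketch under assumed shapes and a hypothetical weighting rule (a softmax over negated per-teacher losses), not the paper's exact CMT-KD formulation; distill_step, the toy linear models, and the hyper-parameters T and alpha are illustrative assumptions.

# Minimal sketch of multi-teacher distillation with dynamically adapted
# teacher weights (hypothetical weighting rule; not the paper's CMT-KD).
import torch
import torch.nn as nn
import torch.nn.functional as F

def distill_step(student, teachers, x, y, T=4.0, alpha=0.5):
    """One training step: blend soft targets from several teachers,
    weighting each teacher by its fit to the current batch."""
    student_logits = student(x)

    with torch.no_grad():
        teacher_logits = [t(x) for t in teachers]
        # Per-teacher cross-entropy on the batch; lower loss -> larger weight.
        teacher_ce = torch.stack([F.cross_entropy(l, y) for l in teacher_logits])
        weights = F.softmax(-teacher_ce, dim=0)  # dynamic knowledge weights
        soft_target = sum(w * F.softmax(l / T, dim=1)
                          for w, l in zip(weights, teacher_logits))

    # Standard distillation objective: KL to the blended soft target
    # plus cross-entropy on the hard labels.
    kd_loss = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                       soft_target, reduction="batchmean") * T * T
    ce_loss = F.cross_entropy(student_logits, y)
    return alpha * kd_loss + (1 - alpha) * ce_loss

if __name__ == "__main__":
    # Toy models stand in for the ResNet-based teachers and student.
    student = nn.Linear(16, 4)
    teachers = [nn.Linear(16, 4) for _ in range(3)]
    x, y = torch.randn(8, 16), torch.randint(0, 4, (8,))
    loss = distill_step(student, teachers, x, y)
    loss.backward()
    print(f"combined loss: {loss.item():.4f}")

In this sketch only the student receives gradients; the teachers are frozen, and their influence shifts from batch to batch as the dynamic weights change.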
Pages: 2638-2643
Page count: 6