Collaborative Multi-Teacher Distillation for Multi-Task Fault Detection in Power Distribution Grid

Cited by: 0
Authors
Huang, Bingzheng [1 ]
Ni, Chengxin [1 ]
Song, Junjie [1 ]
Yin, Yifan [1 ]
Chen, Ningjiang [1 ]
Affiliations
[1] Guangxi Univ, Sch Comp & Elect Informat, Nanning, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
knowledge distillation; multi-task collaboration; deep learning; fault detection;
DOI
10.1109/CSCWD61410.2024.10580632
CLC Classification Number
TP39 [Computer Applications];
Discipline Codes
081203; 0835;
Abstract
Against the backdrop of increasingly complex fault detection scenarios and diversified data in the current power distribution grid, deploying high-performance fault detection models for efficient multi-task collaborative processing on lightweight edge intelligent devices is a significant challenge. We introduce Collaborative Multi-Teacher distillation for multi-task fault detection in the power distribution grid (CMT-KD), which enhances multi-task fault detection by redefining the Residual Network (ResNet). We further compress and accelerate the model through weight quantization, while improving the performance of the multi-task fault detector by dynamically adapting the knowledge weights obtained from multiple teachers. The proposed method detects multiple power distribution grid faults with a single model, significantly reducing the number of model parameters, floating-point operations, and training time, while also improving accuracy and reducing prediction errors.
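As a rough illustration of the distillation scheme described above, the sketch below shows a generic multi-teacher distillation loss in PyTorch. It is not the authors' CMT-KD implementation: the per-teacher weighting rule (a softmax over each teacher's negative cross-entropy on the current batch) and all names are assumptions standing in for the abstract's "dynamically adapting the knowledge weights from multiple teachers."

```python
# Minimal sketch of multi-teacher knowledge distillation with dynamic
# per-teacher weights. NOT the paper's CMT-KD method: the weighting rule
# below (softmax over each teacher's batch-fit score) is only a plausible
# stand-in for its dynamic knowledge-weight adaptation.
import torch
import torch.nn.functional as F

def multi_teacher_kd_loss(student_logits, teacher_logits_list, labels,
                          temperature=4.0, alpha=0.5):
    # Hard-label supervision for the student.
    ce_loss = F.cross_entropy(student_logits, labels)

    # Score each teacher by how well its logits fit the batch labels;
    # better-fitting teachers receive larger distillation weights.
    scores = torch.stack([-F.cross_entropy(t, labels)
                          for t in teacher_logits_list])
    weights = F.softmax(scores, dim=0)  # dynamic, recomputed every batch

    # Temperature-softened KL divergence to each teacher, combined with
    # the dynamic weights (standard T^2 scaling on the soft targets).
    log_p_student = F.log_softmax(student_logits / temperature, dim=1)
    kd_loss = sum(w * F.kl_div(log_p_student,
                               F.softmax(t / temperature, dim=1),
                               reduction="batchmean") * temperature ** 2
                  for w, t in zip(weights, teacher_logits_list))

    return alpha * ce_loss + (1.0 - alpha) * kd_loss

if __name__ == "__main__":
    torch.manual_seed(0)
    student = torch.randn(8, 5)                       # batch of 8, 5 fault classes
    teachers = [torch.randn(8, 5) for _ in range(3)]  # three teacher models
    labels = torch.randint(0, 5, (8,))
    print(multi_teacher_kd_loss(student, teachers, labels))
```

In practice the teacher logits would come from frozen pretrained models (detached from the computation graph), and the weight quantization mentioned in the abstract would be applied to the trained student as a separate compression step.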
Pages: 2638-2643
Number of pages: 6