Neuron Manifold Distillation for Edge Deep Learning

Cited: 0
Authors
Tao, Zeyi [1 ]
Xia, Qi [1 ]
Li, Qun [1 ]
Affiliations
[1] William & Mary, Dept Comp Sci, Williamsburg, VA 23185 USA
Source
2021 IEEE/ACM 29TH INTERNATIONAL SYMPOSIUM ON QUALITY OF SERVICE (IWQOS), 2021
Funding
U.S. National Science Foundation;
Keywords
Cloud Computing; Edge Computing; Machine Learning; Manifold Learning; Dimension Reduction; NONLINEAR DIMENSIONALITY REDUCTION;
DOI
10.1109/IWQOS52092.2021.9521267
CLC Number
TP301 [Theory, Methods];
Discipline Code
081202;
Abstract
Although deep neural networks show extraordinary power in various object detection tasks, deploying them on resource-constrained devices or embedded systems is very challenging due to their high computational cost. Efforts such as model partitioning, pruning, and quantization have been used, at the expense of accuracy loss. The recently proposed knowledge distillation (KD) aims at transferring model knowledge from a well-trained model (the teacher) to a smaller and faster model (the student), which can significantly reduce the computational cost and memory usage and prolong battery lifetime. In this work, we propose a novel neuron manifold distillation (NMD), in which the student model not only imitates the teacher's output activations but also learns the geometric structure of the teacher's features. Our approach produces a high-quality, compact, and lightweight student model. We conduct comprehensive experiments with different distillation configurations over multiple datasets, and the proposed method demonstrates a consistent improvement in the accuracy-speed trade-off of the distilled model.
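The abstract does not give the exact NMD objective, so the following is only a minimal sketch of the two ingredients it describes: a standard soft-label KD term in which the student imitates the teacher's output activations, plus an assumed feature-geometry term that matches pairwise distances between teacher and student features (one common way to capture "feature geometry structure"; the paper's actual manifold term may differ). The function names (distillation_loss, pairwise_distances) and the weights T, alpha, beta are hypothetical.

```python
# Illustrative teacher-student distillation loss (not the paper's exact NMD loss).
import torch
import torch.nn.functional as F

def pairwise_distances(feats: torch.Tensor) -> torch.Tensor:
    """Pairwise Euclidean distances between per-sample feature vectors."""
    flat = feats.flatten(start_dim=1)      # (batch, d)
    return torch.cdist(flat, flat, p=2)    # (batch, batch)

def distillation_loss(student_logits, teacher_logits,
                      student_feats, teacher_feats,
                      labels, T=4.0, alpha=0.5, beta=0.1):
    # The teacher is fixed during distillation; block gradients through it.
    teacher_logits = teacher_logits.detach()
    teacher_feats = teacher_feats.detach()
    # Hard-label cross-entropy on the student's own predictions.
    ce = F.cross_entropy(student_logits, labels)
    # Soft-label KD term: KL divergence between temperature-softened
    # distributions, scaled by T^2 as in standard KD.
    kd = F.kl_div(F.log_softmax(student_logits / T, dim=1),
                  F.softmax(teacher_logits / T, dim=1),
                  reduction="batchmean") * (T * T)
    # Assumed geometry term: align pairwise feature distances so the
    # student preserves the local structure of the teacher's features.
    geo = F.mse_loss(pairwise_distances(student_feats),
                     pairwise_distances(teacher_feats))
    return ce + alpha * kd + beta * geo
```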
Pages: 10
Related Papers
50 records in total
  • [1] Deep Knowledge Distillation Learning for Efficient Wearable Data Mining on the Edge
    Wong, Junhua
    Zhang, Qingxue
2023 IEEE INTERNATIONAL CONFERENCE ON CONSUMER ELECTRONICS, ICCE, 2023
  • [2] EdgePro: Edge Deep Learning Model Protection via Neuron Authorization
    Chen, Jinyin
    Zheng, Haibin
    Liu, Tao
    Liu, Jiawei
    Cheng, Yao
    Zhang, Xuhong
    Ji, Shouling
    IEEE TRANSACTIONS ON DEPENDABLE AND SECURE COMPUTING, 2024, 21 (05) : 4967 - 4981
  • [3] DEEP DISCRIMINATIVE MANIFOLD LEARNING
    Chien, Jen-Tzung
    Chen, Ching-Huai
    2016 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING PROCEEDINGS, 2016, : 2672 - 2676
  • [4] KNOWLEDGE DISTILLATION FOR WIRELESS EDGE LEARNING
    Mohamed, Ahmed P.
    Fameel, Abu Shafin Mohammad Mandee
    El Gamal, Aly
    2021 IEEE STATISTICAL SIGNAL PROCESSING WORKSHOP (SSP), 2021, : 600 - 604
  • [5] Manifold Learning of Brain MRIs by Deep Learning
    Brosch, Tom
    Tam, Roger
    MEDICAL IMAGE COMPUTING AND COMPUTER-ASSISTED INTERVENTION - MICCAI 2013, PT II, 2013, 8150 : 633 - 640
  • [6] Neuron segmentation with deep learning
    Vogt, Nina
    NATURE METHODS, 2019, 16 (06) : 460 - 460
  • [7] A Review of Dataset Distillation for Deep Learning
    Thi-Thu-Huong Le
    Larasati, Harashta Tatimma
    Prihatno, Aji Teguh
    Kim, Howon
    2022 INTERNATIONAL CONFERENCE ON PLATFORM TECHNOLOGY AND SERVICE (PLATCON22), 2022, : 34 - 37
  • [8] Bayesian Distillation of Deep Learning Models
    Grabovoy, A. V.
    Strijov, V. V.
    AUTOMATION AND REMOTE CONTROL, 2021, 82 (11) : 1846 - 1856