Deep Knowledge Distillation Learning for Efficient Wearable Data Mining on the Edge

Cited: 0
Authors
Wong, Junhua [1 ,3 ]
Zhang, Qingxue [1 ,2 ,3 ,4 ]
Affiliations
[1] Purdue Univ, Elect & Comp Engn, Indianapolis, IN 46202 USA
[2] Purdue Univ, Biomed Engn, Indianapolis, IN 46202 USA
[3] Purdue Sch Engn & Technol, Elect & Comp Engn, Indianapolis, IN 46202 USA
[4] Purdue Sch Engn & Technol, Biomed Engn, Indianapolis, IN 46202 USA
Funding
U.S. National Science Foundation;
Keywords
Knowledge Distillation; Wearable Big Data; Deep Learning; Edge Computing;
DOI
10.1109/ICCE56470.2023.10043546
Chinese Library Classification
TP39 [Computer applications];
Discipline Code
081203; 0835;
Abstract
Wearable data mining on the edge is essential for consumer electronics, providing real-time insights to users. However, the challenge lies in the high computing complexity and energy consumption of the algorithms, especially advanced deep neural networks. To address this challenge, we propose a deep knowledge distillation learning algorithm that yields a lightweight, edge-deployable deep learning model. We take a special interest in the electrocardiogram-based cardiac disease detection application. More specifically, we first design a heavy teacher model for cardiac disease detection. We then leverage the soft target distribution of the teacher model to supervise the training of a lightweight student model. In this way, the student model learns the knowledge of the teacher model while retaining an energy-efficient structure with significantly fewer parameters. Evaluated on the cardiac disease detection task, our framework demonstrates promising effectiveness, and this study will therefore greatly advance efficient wearable data mining on the edge in consumer electronics.
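The distillation scheme the abstract describes, in which the teacher's soft target distribution supervises the student alongside the hard labels, can be sketched as below. This is a minimal NumPy illustration of the standard Hinton-style distillation loss, not the paper's implementation; the temperature `T` and weighting `alpha` are illustrative assumptions.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax: a higher T produces a softer distribution,
    # exposing more of the teacher's "dark knowledge" about non-target classes.
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Weighted sum of (a) the KL divergence between the teacher's and the
    student's temperature-softened distributions and (b) ordinary
    cross-entropy against the hard labels."""
    p_t = softmax(teacher_logits, T)   # soft targets from the heavy teacher
    p_s = softmax(student_logits, T)   # student's softened predictions
    # KL(p_t || p_s), scaled by T^2 so its gradient magnitude matches the hard loss
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    probs = softmax(student_logits)    # unsoftened probabilities for the hard loss
    hard = -np.log(probs[np.arange(len(labels)), labels] + 1e-12)
    return float(np.mean(alpha * (T ** 2) * kl + (1 - alpha) * hard))
```

During training, the student minimizes this combined objective, so a small model with far fewer parameters can approach the teacher's decision behavior, which is what makes on-device cardiac disease detection tractable.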
Pages: 3