Learning With Sharing: An Edge-Optimized Incremental Learning Method for Deep Neural Networks

Cited by: 3
Authors
Hussain, Muhammad Awais [1 ]
Huang, Shih-An [1 ]
Tsai, Tsung-Han [2 ]
Affiliations
[1] Natl Cent Univ, Elect Engn, Taoyuan 320, Taiwan
[2] Natl Cent Univ, Dept Elect Engn, Taoyuan 320, Taiwan
Keywords
Incremental learning; deep neural networks; learning on-chip; energy-efficient learning; network sharing
DOI
10.1109/TETC.2022.3210905
CLC Classification Number
TP [Automation Technology, Computer Technology]
Subject Classification Number
0812
Abstract
Incremental learning techniques aim to extend a pre-trained Deep Neural Network (DNN) model with new classes. However, DNNs suffer from catastrophic forgetting during the incremental learning process. Existing incremental learning techniques try to reduce the effect of catastrophic forgetting either by replaying samples of previously learned data while adding new classes to the model or by designing complex model architectures. This leads to high design complexity and memory requirements, which make incremental learning impractical on edge devices with limited memory and computation resources. We therefore propose a new incremental learning technique, Learning with Sharing (LwS), based on the concept of transfer learning. LwS aims to reduce training complexity and storage memory requirements while achieving high accuracy during the incremental learning process. We clone and share the fully connected (FC) layers to add new classes to the model incrementally. Our proposed technique preserves the knowledge of existing classes and adds new classes without storing any data from the previous classes. We show that our proposed technique outperforms state-of-the-art techniques in accuracy on the CIFAR-100, Caltech-101, and UCSD Birds datasets.
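The cloning-and-sharing step described in the abstract can be sketched in plain Python. This is only a toy illustration under assumed names and shapes (`LwSSketch`, `add_task`, a random linear "backbone"), not the authors' implementation: a frozen shared feature extractor serves every increment, and each new group of classes starts from a clone of the most recent FC head, so earlier heads (old-class knowledge) are never modified and no old data is stored.

```python
import random

class LwSSketch:
    """Toy sketch of LwS-style cloning and sharing (illustrative only)."""

    def __init__(self, in_dim=8, feat_dim=4, classes_per_task=10, seed=0):
        self.rng = random.Random(seed)
        self.cpt = classes_per_task
        # Shared feature extractor, frozen after pre-training.
        self.backbone = [[self.rng.gauss(0, 1) for _ in range(in_dim)]
                         for _ in range(feat_dim)]
        # One FC head per increment; only the newest head is ever trained.
        self.heads = [[[self.rng.gauss(0, 1) for _ in range(feat_dim)]
                       for _ in range(classes_per_task)]]

    def add_task(self):
        # Cloning step: copy the latest head (plus small noise) as the
        # starting point for the new classes; earlier heads stay untouched.
        clone = [[w + 0.01 * self.rng.gauss(0, 1) for w in row]
                 for row in self.heads[-1]]
        self.heads.append(clone)

    def features(self, x):
        # Frozen shared features: ReLU of a fixed linear map.
        return [max(0.0, sum(w * xi for w, xi in zip(row, x)))
                for row in self.backbone]

    def predict(self, x):
        f = self.features(x)
        # Concatenate logits from all heads; index is the global class id.
        logits = [sum(w * fi for w, fi in zip(row, f))
                  for head in self.heads for row in head]
        return max(range(len(logits)), key=logits.__getitem__)

model = LwSSketch()
snapshot = [row[:] for row in model.heads[0]]
model.add_task()                     # add a second class group, no old data
assert model.heads[0] == snapshot    # old-class head is preserved exactly
print(len(model.heads) * model.cpt)  # → 20 classes covered
```

In the paper's setting the shared part would be the pre-trained convolutional layers and only the cloned FC head is fine-tuned on the new classes, which is what keeps both training cost and storage low.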
Pages: 461-473
Number of pages: 13