Budget Restricted Incremental Learning with Pre-Trained Convolutional Neural Networks and Binary Associative Memories

Cited by: 6
Authors
Hacene, Ghouthi Boukli [1 ]
Gripon, Vincent [1 ]
Farrugia, Nicolas [1 ]
Arzel, Matthieu [1 ]
Jezequel, Michel [1 ]
Affiliations
[1] IMT Atlantique, Brest, France
Keywords
Computer vision; Deep learning; Transfer learning; Incremental learning; Learning on chip; Algorithm
DOI
10.1007/s11265-019-01450-z
Chinese Library Classification (CLC): TP [Automation technology, computer technology]
Discipline Classification Code: 0812
Abstract
For the past few years, Deep Neural Networks (DNNs) have achieved state-of-the-art performance in numerous challenging domains. To reach this performance, DNNs rely on large sets of parameters and complex architectures, which are trained offline on huge datasets. The complexity and size of DNN architectures make such approaches difficult to implement in budget-restricted applications such as embedded systems. Furthermore, DNNs cannot incrementally learn new data without forgetting previously acquired knowledge, which makes embedded applications even more challenging because the whole dataset would have to be stored. To tackle this problem, we introduce an incremental learning method that combines pre-trained DNNs, binary associative memories, and product quantization (PQ) as a bridge between them. The resulting method has low computational and memory requirements, and reaches good performance on challenging vision datasets. Moreover, we present a hardware implementation, validated on an FPGA target, that uses few hardware resources while providing substantial processing acceleration compared to a CPU counterpart.
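The abstract describes the pipeline only at a high level. The sketch below illustrates, under stated assumptions, how frozen pre-trained CNN features, product quantization, and a binary associative memory could fit together for incremental learning: each feature vector is split into sub-vectors, each sub-vector is mapped to its nearest anchor, and a binary memory records which (sub-vector, anchor) pairs have been observed for each class. The feature dimension, the placeholder cnn_features function, the random codebooks, and all sizes are illustrative assumptions, not the authors' implementation or the FPGA design.

import numpy as np

rng = np.random.default_rng(0)

D = 512           # assumed dimension of the frozen CNN feature vector
P = 8             # number of PQ sub-vectors
K = 64            # number of anchors (codewords) per sub-vector
NUM_CLASSES = 10  # assumed number of classes

def cnn_features(image):
    """Stand-in for a frozen, pre-trained CNN feature extractor
    (e.g. the penultimate layer of an ImageNet-trained model).
    Returns a random vector here, so outputs are illustrative only."""
    return rng.standard_normal(D)

# PQ codebooks: P codebooks, each with K anchors of length D // P.
# Random here; in practice they would be learned (e.g. by k-means)
# on features from an auxiliary dataset.
codebooks = rng.standard_normal((P, K, D // P))

def pq_encode(feature):
    """Split the feature into P sub-vectors and return, for each one,
    the index of its nearest anchor in the corresponding codebook."""
    subs = feature.reshape(P, D // P)
    return np.array([
        np.argmin(np.linalg.norm(codebooks[p] - subs[p], axis=1))
        for p in range(P)
    ])

# Binary associative memory: one bit per (sub-vector, anchor, class) triple.
memory = np.zeros((P, K, NUM_CLASSES), dtype=bool)

def learn(image, label):
    """Incremental learning step: set the bits linking the example's
    quantized codes to its class. No gradients, no stored dataset."""
    codes = pq_encode(cnn_features(image))
    memory[np.arange(P), codes, label] = True

def classify(image):
    """Score each class by counting how many of the P codes are
    associated with it in the memory, then return the top class."""
    codes = pq_encode(cnn_features(image))
    votes = memory[np.arange(P), codes, :].sum(axis=0)
    return int(np.argmax(votes))

# Usage: examples can arrive one at a time, in any class order.
learn(image="some_image", label=3)
predicted = classify("another_image")

Because learning only sets bits in the memory, new examples or classes can be added one at a time without revisiting previously stored data, which is the property the abstract highlights for budget-restricted, on-chip learning.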
Pages: 1063-1073
Number of pages: 11
Related Papers
50 items in total
  • [1] Budget Restricted Incremental Learning with Pre-Trained Convolutional Neural Networks and Binary Associative Memories
    Hacene, Ghouthi Boukli
    Gripon, Vincent
    Farrugia, Nicolas
    Arzel, Matthieu
    Jezequel, Michel
    2017 IEEE INTERNATIONAL WORKSHOP ON SIGNAL PROCESSING SYSTEMS (SIPS), 2017,
  • [2] Budget Restricted Incremental Learning with Pre-Trained Convolutional Neural Networks and Binary Associative Memories
    Hacene, Ghouthi Boukli
    Gripon, Vincent
    Farrugia, Nicolas
    Arzel, Matthieu
    Jezequel, Michel
    Journal of Signal Processing Systems, 2019, 91 : 1063 - 1073
  • [3] Pre-trained Convolutional Neural Networks for the Lung Sounds Classification
    Vaityshyn, Valentyn
    Porieva, Hanna
    Makarenkova, Anastasiia
    2019 IEEE 39TH INTERNATIONAL CONFERENCE ON ELECTRONICS AND NANOTECHNOLOGY (ELNANO), 2019: 522-525
  • [4] Transfer learning with pre-trained deep convolutional neural networks for serous cell classification
    Baykal, Elif
    Dogan, Hulya
    Ercin, Mustafa Emre
    Ersoz, Safak
    Ekinci, Murat
    MULTIMEDIA TOOLS AND APPLICATIONS, 2020, 79 (21-22) : 15593 - 15611
  • [5] Transfer learning with pre-trained deep convolutional neural networks for serous cell classification
    Baykal, Elif
    Dogan, Hulya
    Ercin, Mustafa Emre
    Ersoz, Safak
    Ekinci, Murat
    Multimedia Tools and Applications, 2020, 79 : 15593 - 15611
  • [6] Dynamic Convolutional Neural Networks as Efficient Pre-Trained Audio Models
    Schmid, Florian
    Koutini, Khaled
    Widmer, Gerhard
    IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2024, 32 : 2227 - 2241
  • [7] Classification of Deepfake Videos Using Pre-trained Convolutional Neural Networks
    Masood, Momina
    Nawaz, Marriam
    Javed, Ali
    Nazir, Tahira
    Mehmood, Awais
    Mahum, Rabbia
    2021 INTERNATIONAL CONFERENCE ON DIGITAL FUTURES AND TRANSFORMATIVE TECHNOLOGIES (ICODT2), 2021,
  • [8] Performance Improvement Of Pre-trained Convolutional Neural Networks For Action Recognition
    Ozcan, Tayyip
    Basturk, Alper
    COMPUTER JOURNAL, 2021, 64 (11): 1715-1730
  • [9] Automatic variogram inference using pre-trained Convolutional Neural Networks
    Karim, Mokdad
    Behrang, Koushavand
    Jeff, Boisvert
    APPLIED COMPUTING AND GEOSCIENCES, 2025, 25
  • [10] Pre-trained convolutional neural networks as feature extractors for tuberculosis detection
    Lopes, U. K.
    Valiati, J. F.
    COMPUTERS IN BIOLOGY AND MEDICINE, 2017, 89 : 135 - 143