BAG-OF-FEATURES-BASED KNOWLEDGE DISTILLATION FOR LIGHTWEIGHT CONVOLUTIONAL NEURAL NETWORKS

Cited by: 0
Authors
Chariton, Alexandros [1 ]
Passalis, Nikolaos [1 ]
Tefas, Anastasios [1 ]
Affiliations
[1] Aristotle University of Thessaloniki, Department of Informatics, Computational Intelligence & Deep Learning Group, AIIA Lab, Thessaloniki, Greece
Keywords
Knowledge Distillation; Bag-of-Features; Mutual Information; Convolutional Neural Networks
DOI
10.1109/ICIP46576.2022.9897390
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Knowledge distillation transfers knowledge from a large, complex neural network to a smaller, faster one, improving the accuracy of the smaller network. However, directly transferring knowledge between the enormous feature maps extracted from convolutional layers is not straightforward. In this work, we propose an efficient mutual information-based approach for transferring knowledge between feature maps extracted from different networks. The proposed method employs an efficient Neural Bag-of-Features formulation to estimate the joint and marginal probabilities and then optimizes the whole pipeline in an end-to-end manner. The effectiveness of the proposed method is demonstrated on a lightweight, fully convolutional neural network architecture that aims at high-resolution analysis and targets photonic neural network accelerators.
Pages: 1541-1545 (5 pages)
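The abstract above outlines the core mechanism: feature maps are summarized into compact Bag-of-Features histograms, which stand in for probability estimates when transferring knowledge between teacher and student. The sketch below illustrates one plausible reading of that pipeline in PyTorch; it is not the authors' implementation — `NeuralBoF`, `bof_distillation_loss`, and the KL-divergence term (used here as a simple proxy for the paper's mutual-information objective) are all assumptions for illustration.

```python
# Hypothetical sketch of BoF-based feature distillation (not the paper's code).
import torch
import torch.nn as nn
import torch.nn.functional as F


class NeuralBoF(nn.Module):
    """Soft-quantizes the spatial vectors of a feature map into a
    K-bin histogram using a learnable codebook (Neural Bag-of-Features)."""

    def __init__(self, in_channels: int, n_codewords: int):
        super().__init__()
        self.codewords = nn.Parameter(torch.randn(n_codewords, in_channels))
        self.sigma = nn.Parameter(torch.ones(1))  # softness of the assignment

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        # feats: (B, C, H, W) -> one C-dimensional vector per spatial location
        b, c, h, w = feats.shape
        x = feats.flatten(2).transpose(1, 2)               # (B, H*W, C)
        codebook = self.codewords.unsqueeze(0).expand(b, -1, -1)
        dists = torch.cdist(x, codebook) ** 2              # (B, H*W, K)
        assign = F.softmax(-self.sigma * dists, dim=-1)    # soft assignments
        return assign.mean(dim=1)                          # histogram (B, K)


def bof_distillation_loss(t_feats, s_feats, t_bof, s_bof, eps=1e-8):
    """KL divergence between teacher and student BoF histograms — a
    stand-in for the mutual-information criterion described in the paper."""
    with torch.no_grad():
        p_teacher = t_bof(t_feats)   # teacher side is not trained here
    p_student = s_bof(s_feats)
    return F.kl_div((p_student + eps).log(), p_teacher, reduction="batchmean")
```

In training, such a term would be added to the student's task loss with a weighting factor, and the student together with its BoF module would be optimized jointly, in the spirit of the end-to-end pipeline the abstract describes.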
Related Papers (50 records)
  • [1] Training Lightweight Deep Convolutional Neural Networks Using Bag-of-Features Pooling
    Passalis, Nikolaos
    Tefas, Anastasios
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2019, 30 (06): 1705-1715
  • [2] Lightweight convolutional neural network with knowledge distillation for cervical cells classification
    Chen, Wen
    Gao, Liang
    Li, Xinyu
    Shen, Weiming
    BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2022, 71
  • [3] Effective training of convolutional neural networks for age estimation based on knowledge distillation
    Greco, Antonio
    Saggese, Alessia
    Vento, Mario
    Vigilante, Vincenzo
    NEURAL COMPUTING & APPLICATIONS, 2022, 34 (24): 21449-21464
  • [4] Knowledge distillation circumvents nonlinearity for optical convolutional neural networks
    Xiang, Jinlin
    Colburn, Shane
    Majumdar, Arka
    Shlizerman, Eli
    APPLIED OPTICS, 2022, 61 (09): 2173-2183
  • [5] A Lightweight Method for Graph Neural Networks Based on Knowledge Distillation and Graph Contrastive Learning
    Wang, Yong
    Yang, Shuqun
    APPLIED SCIENCES-BASEL, 2024, 14 (11)
  • [6] LightweightNet: Toward fast and lightweight convolutional neural networks via architecture distillation
    Xu, Ting-Bing
    Yang, Peipei
    Zhang, Xu-Yao
    Liu, Cheng-Lin
    PATTERN RECOGNITION, 2019, 88: 272-284
  • [7] Learning Bag-of-Features Pooling for Deep Convolutional Neural Networks
    Passalis, Nikolaos
    Tefas, Anastasios
    2017 IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV), 2017: 5766-5774
  • [8] Face Recognition Based on Lightweight Convolutional Neural Networks
    Liu, Wenting
    Zhou, Li
    Chen, Jie
    INFORMATION, 2021, 12 (05)
  • [9] Knowledge distillation with ensembles of convolutional neural networks for medical image segmentation
    Noothout, Julia M. H.
    Lessmann, Nikolas
    van Eede, Matthijs C.
    van Harten, Louis D.
    Sogancioglu, Ecem
    Heslinga, Friso G.
    Veta, Mitko
    van Ginneken, Bram
    Isgum, Ivana
    JOURNAL OF MEDICAL IMAGING, 2022, 9 (05)