Training Deep Face Recognition for Efficient Inference by Distillation and Mutual Learning

Citations: 0
Authors
Shen, Guodong [1 ]
Shen, Yao [1 ]
Riaz, M. Naveed [1 ]
Affiliations
[1] Shanghai Jiao Tong Univ, Dept Comp Sci & Engn, Shanghai, Peoples R China
Keywords
face recognition; model distillation; mutual learning; deep learning;
DOI
Not available
Chinese Library Classification (CLC)
TP [Automation & Computer Technology];
Discipline Code
0812;
Abstract
Currently, most deep face recognition algorithms rely on heavy networks to achieve state-of-the-art performance. In most scenarios, the more challenging task is to reach relatively high accuracy at low computational cost, especially on embedded devices. In this paper, we propose a lightweight network for face recognition trained with distillation and deep mutual learning. In the proposed method, a new indicator is designed to monitor model convergence, and an assessment criterion is developed for evaluation on the Labeled Faces in the Wild (LFW) dataset. Experiments show that our models outperform both networks trained directly and other mobile face recognition solutions.
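The abstract combines distillation with deep mutual learning, in which peer networks train jointly and each mimics the others' softened predictions. The paper's exact loss formulation is not reproduced in this record, but a minimal sketch of the symmetric KL mimicry term used in standard deep mutual learning (assuming temperature-softened logits, as in conventional distillation; all function names here are illustrative) might look like:

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-softened softmax, as in standard knowledge distillation.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kl_div(p, q, eps=1e-12):
    # KL(p || q), with a small epsilon to avoid log(0).
    return sum(pi * (math.log(pi + eps) - math.log(qi + eps))
               for pi, qi in zip(p, q))

def mutual_learning_losses(logits_a, logits_b, temperature=2.0):
    """Symmetric mimicry losses of deep mutual learning: each peer
    treats the other's softened prediction as a target. Returns
    (loss_for_a, loss_for_b); in training, each term is added to the
    usual supervised cross-entropy loss (omitted here)."""
    p_a = softmax(logits_a, temperature)
    p_b = softmax(logits_b, temperature)
    # The T^2 factor keeps gradient magnitudes comparable across temperatures.
    loss_a = kl_div(p_b, p_a) * temperature ** 2
    loss_b = kl_div(p_a, p_b) * temperature ** 2
    return loss_a, loss_b
```

When the two peers agree exactly, both mimicry losses vanish; as their predictions diverge, each network is pulled toward the other, which is the mechanism that lets a lightweight student match a heavier teacher more closely than direct training.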
Pages: 38-43
Page count: 6