TRIPLET DISTILLATION FOR DEEP FACE RECOGNITION

Times Cited: 0
Authors
Feng, Yushu [1 ]
Wang, Huan [1 ]
Hu, Haoji [1 ]
Yu, Lu [1 ]
Wang, Wei [2 ]
Wang, Shiyan [2 ]
Affiliations
[1] Zhejiang Univ, Coll Informat Sci & Elect Engn, Hangzhou, Peoples R China
[2] Chongqing Univ Posts & Telecommun, Chongqing, Peoples R China
Keywords
Face Recognition; Knowledge Distillation; Triplet Loss; Network Compression;
DOI
Not available
Chinese Library Classification
TB8 [Photographic Technology];
Discipline Code
0804 ;
Abstract
Convolutional neural networks (CNNs) have achieved great success in face recognition, which unfortunately comes at the cost of massive computation and storage consumption. Many compact face recognition networks have thus been proposed to resolve this problem, and triplet loss is effective at further improving the performance of these compact models. However, it normally applies a fixed margin to all samples, which neglects the informative similarity structure between different identities. In this paper, we borrow the idea of knowledge distillation and define the informative similarity as the transferred knowledge. We then propose an enhanced version of triplet loss, named triplet distillation, which exploits the capability of a teacher model to transfer similarity information to a student model by adaptively varying the margin between positive and negative pairs. Experiments on the LFW, AgeDB, and CPLFW datasets show the merits of our method compared to the original triplet loss.
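The abstract describes a triplet loss whose margin is set per-triplet from a teacher model's similarity structure rather than fixed. The paper's exact margin function is not given in this record; the following is a minimal PyTorch sketch under the assumption that the margin interpolates between a lower and upper bound according to the teacher's positive/negative similarity gap (the function name, bounds `m_min`/`m_max`, and the cosine-based gap are illustrative, not from the paper).

```python
import torch
import torch.nn.functional as F

def triplet_distillation_loss(s_anchor, s_pos, s_neg,
                              t_anchor, t_pos, t_neg,
                              m_min=0.2, m_max=0.7):
    """Triplet loss with a teacher-adapted margin (illustrative sketch).

    All inputs are (N, D) embedding batches: s_* from the student,
    t_* from a frozen teacher.
    """
    # Teacher's view of how separable each triplet is: a large gap
    # between positive and negative similarity means an "easy" triplet,
    # so demand a larger margin from the student; a small gap, a smaller one.
    with torch.no_grad():
        sim_pos = F.cosine_similarity(t_anchor, t_pos)   # (N,)
        sim_neg = F.cosine_similarity(t_anchor, t_neg)   # (N,)
        gap = (sim_pos - sim_neg).clamp(0.0, 2.0) / 2.0  # in [0, 1]
        margin = m_min + (m_max - m_min) * gap           # in [m_min, m_max]

    # Standard triplet hinge on squared Euclidean distances,
    # but with the per-sample adaptive margin above.
    d_pos = (s_anchor - s_pos).pow(2).sum(dim=1)
    d_neg = (s_anchor - s_neg).pow(2).sum(dim=1)
    return F.relu(d_pos - d_neg + margin).mean()
```

With `m_min == m_max` this reduces to the ordinary fixed-margin triplet loss, which is the baseline the abstract compares against.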
Pages: 808 - 812
Page count: 5
Related Papers
50 records
  • [1] Evaluation-oriented Knowledge Distillation for Deep Face Recognition
    Huang, Yuge
    Wu, Jiaxiang
    Xu, Xingkun
    Ding, Shouhong
    [J]. 2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2022), 2022, : 18719 - 18728
  • [2] Training Deep Face Recognition for Efficient Inference by Distillation and Mutual Learning
    Shen, Guodong
    Shen, Yao
    Riaz, M. Naveed
    [J]. PROCEEDINGS OF THE 2018 IEEE INTERNATIONAL CONFERENCE ON PROGRESS IN INFORMATICS AND COMPUTING (PIC), 2018, : 38 - 43
  • [3] Light Deep Face Recognition based on Knowledge Distillation and Adversarial Training
    Liu, Jinjin
    Li, Xiaonan
    [J]. 2022 INTERNATIONAL CONFERENCE ON MECHANICAL, AUTOMATION AND ELECTRICAL ENGINEERING, CMAEE, 2022, : 127 - 132
  • [4] Enhanced Knowledge Distillation for Face Recognition
    Ni, Hao
    Shen, Jie
    Yuan, Chong
    [J]. 2019 IEEE INTL CONF ON PARALLEL & DISTRIBUTED PROCESSING WITH APPLICATIONS, BIG DATA & CLOUD COMPUTING, SUSTAINABLE COMPUTING & COMMUNICATIONS, SOCIAL COMPUTING & NETWORKING (ISPA/BDCLOUD/SOCIALCOM/SUSTAINCOM 2019), 2019, : 1441 - 1444
  • [5] Deep Metric Learning with Triplet-Margin-Center Loss for Sketch Face Recognition
    Feng, Yujian
    Wu, Fei
    Ji, Yimu
    Jing, Xiao-Yuan
    Yu, Jian
    [J]. IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS, 2020, E103D (11): : 2394 - 2397
  • [7] A single-stage face detection and face recognition deep neural network based on feature pyramid and triplet loss
    Tsai, Tsung-Han
    Chi, Po-Ting
    [J]. IET IMAGE PROCESSING, 2022, 16 (08) : 2148 - 2156
  • [8] CoupleFace: Relation Matters for Face Recognition Distillation
    Liu, Jiaheng
    Qin, Haoyu
    Wu, Yichao
    Guo, Jinyang
    Liang, Ding
    Xu, Ke
    [J]. COMPUTER VISION, ECCV 2022, PT XII, 2022, 13672 : 683 - 700
  • [9] Cross-Architecture Distillation for Face Recognition
    Zhao, Weisong
    Zhu, Xiangyu
    He, Zhixiang
    Zhang, Xiao-Yu
    Lei, Zhen
    [J]. PROCEEDINGS OF THE 31ST ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, MM 2023, 2023, : 8076 - 8085
  • [10] Deep Face Recognition: a Survey
    Masi, Iacopo
    Wu, Yue
    Hassner, Tal
    Natarajan, Prem
    [J]. PROCEEDINGS 2018 31ST SIBGRAPI CONFERENCE ON GRAPHICS, PATTERNS AND IMAGES (SIBGRAPI), 2018, : 471 - 478