Cross-View Gait Recognition Method Based on Multi-Teacher Joint Knowledge Distillation

Cited by: 0
Authors
Li, Ruoyu [1 ,2 ]
Yun, Lijun [1 ,2 ]
Zhang, Mingxuan [3 ]
Yang, Yanchen [1 ,2 ]
Cheng, Feiyan [1 ,2 ]
Affiliations
[1] Yunnan Normal Univ, Coll Informat, Kunming 650500, Peoples R China
[2] Engn Res Ctr Comp Vis & Intelligent Control Techno, Dept Educ, Kunming 650500, Peoples R China
[3] Xian Inst Appl Opt, Xian 710000, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
cross-view gait recognition; multi-teacher joint knowledge distillation; resnet;
DOI
10.3390/s23229289
Chinese Library Classification
O65 [Analytical Chemistry];
Discipline Classification Code
070302 ; 081704 ;
Abstract
To address challenges in cross-view gait recognition such as high network-model complexity, a large number of parameters, and slow training and testing, this paper proposes Multi-teacher Joint Knowledge Distillation (MJKD). The algorithm trains multiple complex teacher models on gait images from single views, extracts the inter-class relationships each teacher learns, and fuses them by weighting into a joint set of inter-class relationships. These fused relationships guide the training of a lightweight student model, improving its gait feature extraction capability and recognition accuracy. To validate the effectiveness of MJKD, experiments are performed on the CASIA_B dataset using the ResNet network as the benchmark. The experimental results show that the student model trained by MJKD achieves 98.24% recognition accuracy while significantly reducing the number of parameters and computational cost.
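The abstract's core idea, fusing the softened class distributions of several teachers by weighting and using the result to supervise a lightweight student, can be sketched as a standard multi-teacher distillation loss. This is a minimal illustration, not the paper's exact formulation: the function name `mjkd_loss`, the temperature `T`, the balance factor `alpha`, and the use of a KL-divergence term plus hard-label cross-entropy are all assumptions modeled on conventional knowledge distillation (Hinton et al.), since the record does not specify the loss.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the class axis, numerically stabilized."""
    z = z / T
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def mjkd_loss(student_logits, teacher_logits_list, teacher_weights,
              labels, T=4.0, alpha=0.7):
    """Hypothetical multi-teacher distillation loss:
    weighted fusion of teacher class distributions + hard-label CE."""
    w = np.asarray(teacher_weights, dtype=float)
    w = w / w.sum()  # normalize the per-teacher fusion weights
    # Fuse teachers: weighted average of their softened class distributions
    fused = sum(wi * softmax(t, T) for wi, t in zip(w, teacher_logits_list))
    p_s = softmax(student_logits, T)
    # Distillation term: KL(fused teacher distribution || student), scaled by T^2
    kd = np.mean(np.sum(fused * (np.log(fused + 1e-12) - np.log(p_s + 1e-12)),
                        axis=1)) * T * T
    # Hard-label cross-entropy on the student's unsoftened predictions
    p_hard = softmax(student_logits)
    ce = -np.mean(np.log(p_hard[np.arange(len(labels)), labels] + 1e-12))
    return alpha * kd + (1 - alpha) * ce
```

In this sketch, `teacher_weights` plays the role of the weighting step the abstract describes; in practice such weights could be fixed, accuracy-based, or learned, and the record does not say which the authors use.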
Pages: 17
Related Papers
50 records in total
  • [1] Multi-teacher knowledge distillation based on joint Guidance of Probe and Adaptive Corrector
    Shang, Ronghua
    Li, Wenzheng
    Zhu, Songling
    Jiao, Licheng
    Li, Yangyang
    [J]. NEURAL NETWORKS, 2023, 164 : 345 - 356
  • [2] Cross-View Gait Recognition Using Joint Bayesian
    Li, Chao
    Sun, Shouqian
    Chen, Xiaoyu
    Min, Xin
    [J]. NINTH INTERNATIONAL CONFERENCE ON DIGITAL IMAGE PROCESSING (ICDIP 2017), 2017, 10420
  • [3] Decoupled Multi-teacher Knowledge Distillation based on Entropy
    Cheng, Xin
    Tang, Jialiang
    Zhang, Zhiqiang
    Yu, Wenxin
    Jiang, Ning
    Zhou, Jinjia
    [J]. 2024 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS, ISCAS 2024, 2024,
  • [4] Anomaly detection based on multi-teacher knowledge distillation
    Ma, Ye
    Jiang, Xu
    Guan, Nan
    Yi, Wang
    [J]. JOURNAL OF SYSTEMS ARCHITECTURE, 2023, 138
  • [5] Multi-View Gait Image Generation for Cross-View Gait Recognition
    Chen, Xin
    Luo, Xizhao
    Weng, Jian
    Luo, Weiqi
    Li, Huiting
    Tian, Qi
    [J]. IEEE TRANSACTIONS ON IMAGE PROCESSING, 2021, 30 : 3041 - 3055
  • [6] Multi-teacher knowledge distillation for compressed video action recognition based on deep learning
    Wu, Meng-Chieh
    Chiu, Ching-Te
    [J]. JOURNAL OF SYSTEMS ARCHITECTURE, 2020, 103
  • [7] Cross-View Gait Recognition Method Based on Multi-branch Residual Deep Network
    Hu, Shaohui
    Wang, Xiuhui
    Liu, Yanqiu
    [J]. Moshi Shibie yu Rengong Zhineng/Pattern Recognition and Artificial Intelligence, 2021, 34 (05): : 455 - 462
  • [8] Reinforced Multi-Teacher Selection for Knowledge Distillation
    Yuan, Fei
    Shou, Linjun
    Pei, Jian
    Lin, Wutao
    Gong, Ming
    Fu, Yan
    Jiang, Daxin
    [J]. THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 14284 - 14291
  • [9] Correlation Guided Multi-teacher Knowledge Distillation
    Shi, Luyao
    Jiang, Ning
    Tang, Jialiang
    Huang, Xinlei
    [J]. NEURAL INFORMATION PROCESSING, ICONIP 2023, PT IV, 2024, 14450 : 562 - 574
  • [10] mKDNAD: A network flow anomaly detection method based on multi-teacher knowledge distillation
    Yang, Yang
    Liu, Dan
    [J]. 2022 16TH IEEE INTERNATIONAL CONFERENCE ON SIGNAL PROCESSING (ICSP2022), VOL 1, 2022, : 314 - 319