GaitSet: Regarding Gait as a Set for Cross-View Gait Recognition

Cited: 0
Authors
Chao, Hanqing [1 ]
He, Yiwei [1 ]
Zhang, Junping [1 ]
Feng, Jianfeng [2 ]
Affiliations
[1] Fudan Univ, Shanghai Key Lab Intelligent Informat Proc, Sch Comp Sci, Shanghai 200433, Peoples R China
[2] Fudan Univ, Inst Sci & Technol Brain Inspired Intelligence, Shanghai 200433, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
As a unique biometric feature that can be recognized at a distance, gait has broad applications in crime prevention, forensic identification, and social security. To portray a gait, existing recognition methods utilize either a gait template, in which temporal information is hard to preserve, or a gait sequence, which imposes unnecessary sequential constraints and thus sacrifices flexibility. In this paper, we present a novel perspective in which a gait is regarded as a set of independent frames, and we propose a new network, GaitSet, to learn identity information from such a set. Under this set perspective, our method is immune to frame permutation and can naturally integrate frames from different videos filmed under different scenarios, such as diverse viewing angles and different clothing or carrying conditions. Experiments show that, under normal walking conditions, our single-model method achieves an average rank-1 accuracy of 95.0% on the CASIA-B gait dataset and 87.1% on the OU-MVLP gait dataset, setting a new state of the art. The model is also robust in more complex scenarios: it achieves 87.2% and 70.4% on CASIA-B under bag-carrying and coat-wearing walking conditions, respectively, outperforming the best existing methods by a large margin. It can likewise attain satisfactory accuracy from a small number of frames per test sample, e.g., 82.5% on CASIA-B with only 7 frames. The source code has been released at https://github.com/AbnerHqC/GaitSet.
Pages: 8126-8133
Page count: 8
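
The core idea described in the abstract, treating the frames of a gait video as an unordered set and aggregating per-frame features with a permutation-invariant operation, can be illustrated with a minimal PyTorch sketch. The code below is an illustrative reconstruction of that idea only, not the authors' released implementation (see the GitHub link above); the layer sizes, feature dimension, and the choice of max as the set-pooling operation are assumptions made for brevity.

```python
# Minimal sketch of a set-based gait encoder: each silhouette frame is
# encoded independently, then the per-frame features are fused by a max
# over the frame (set) dimension, which is invariant to frame order and
# tolerant of variable frame counts. Illustrative only; NOT the official
# GaitSet code (https://github.com/AbnerHqC/GaitSet). Layer sizes are
# hypothetical.
import torch
import torch.nn as nn

class SetGaitEncoder(nn.Module):
    def __init__(self, feat_dim=128):
        super().__init__()
        # Small CNN applied to every frame independently.
        self.frame_encoder = nn.Sequential(
            nn.Conv2d(1, 32, 5, padding=2), nn.LeakyReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.LeakyReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(64, feat_dim)

    def forward(self, frames):
        # frames: (batch, num_frames, 1, H, W); frames may come from
        # different videos, views, or clothing/carrying conditions.
        b, n, c, h, w = frames.shape
        x = self.frame_encoder(frames.view(b * n, c, h, w)).view(b, n, -1)
        # Permutation-invariant set pooling: max over the frame dimension.
        set_feat, _ = x.max(dim=1)
        return self.fc(set_feat)

encoder = SetGaitEncoder()
silhouettes = torch.rand(2, 30, 1, 64, 44)    # two sets of 30 frames each
shuffled = silhouettes[:, torch.randperm(30)]  # reorder the frames
# Shuffling the set leaves the identity embedding unchanged.
assert torch.allclose(encoder(silhouettes), encoder(shuffled), atol=1e-5)
```

Because the pooling discards frame order, such an encoder can score a probe with any number of frames, which is consistent with the abstract's observation that accuracy remains satisfactory even with very few frames per sample.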