Thermal Gait Dataset for Deep Learning-Oriented Gait Recognition

Cited by: 1
Authors
Youssef, Fatma [1 ]
El-Mahdy, Ahmed [1 ,2 ]
Ogawa, Tetsuji [3 ]
Gomaa, Walid [2 ,4 ]
Affiliations
[1] Egypt Japan Univ Sci & Technol, Dept Comp Sci & Engn, Alexandria, Egypt
[2] Alexandria Univ, Fac Engn, Alexandria, Egypt
[3] Waseda Univ, Dept Commun & Comp Engn, Tokyo, Japan
[4] Egypt Japan Univ Sci & Technol, Cyber Phys Syst Lab, Alexandria, Egypt
Keywords
thermal imagery; human gait; convolutional neural networks; vision transformers; gender recognition; person verification
DOI
10.1109/IJCNN54540.2023.10191513
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
This study constructed a thermal dataset of human gait in diverse environments, suitable for building and evaluating sophisticated deep learning models (e.g., vision transformers) for gait recognition. Gait is a behavioral biometric that can identify a person without requiring the person's cooperation, making it well suited to security and surveillance applications. For security purposes, it is desirable to recognize a person in darkness or other inadequate lighting conditions, where thermal imagery is advantageous over visible-light imagery. Despite the importance of such nighttime person identification, thermal gait datasets captured in the dark are scarce. This study therefore collected a relatively large set of thermal gait data in both indoor and outdoor environments, covering several walking styles, e.g., walking normally, walking while carrying a bag, and walking fast. The dataset was utilized in multiple gait recognition tasks, such as gender classification and person verification, using legacy convolutional neural networks (CNNs) and modern vision transformers (ViTs). Experiments on this dataset revealed an effective training method for person verification, demonstrated the effectiveness of ViTs for gait recognition, and assessed the robustness of the models against differences in walking style, suggesting that the developed dataset enables a variety of studies on gait recognition with state-of-the-art deep learning models.
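The person-verification task described in the abstract differs from identification: instead of classifying who a person is, the system compares an embedding of a probe gait sequence against an embedding of a claimed identity and accepts or rejects the claim via a similarity threshold. A minimal, hypothetical sketch of that decision step (toy hand-written embeddings standing in for CNN/ViT feature vectors; the threshold value is an illustrative assumption, not one from the paper):

```python
import math


def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


def verify(probe_emb, gallery_emb, threshold=0.8):
    """Accept the identity claim if the two gait embeddings are similar enough.

    In a real system, probe_emb and gallery_emb would be features extracted
    by a trained CNN or ViT backbone from thermal gait sequences; here they
    are toy vectors, and the 0.8 threshold is a hypothetical operating point.
    """
    return cosine_similarity(probe_emb, gallery_emb) >= threshold


# Toy embeddings: nearly parallel vectors mimic the same person,
# a dissimilar vector mimics an impostor.
print(verify([0.9, 0.1, 0.4], [0.85, 0.15, 0.38]))  # accepted
print(verify([0.9, 0.1, 0.4], [0.1, 0.9, 0.2]))     # rejected
```

Thresholding a similarity score like this is what allows verification models to handle people never seen during training, which is why the abstract treats the training method for verification as a separate question from closed-set classification.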
Pages: 8