Human gait recognition by fusing global and local image entropy features with neural networks

Cited by: 1
Authors
Deng, Muqing [1 ]
Sun, Yuanyou [1 ]
Fan, Zhuyao [2 ]
Feng, Xiaoreng [3 ]
Affiliations
[1] Guangdong Univ Technol, Sch Automat, Guangdong Key Lab IoT Informat Technol, Guangzhou, Peoples R China
[2] Hangzhou Dianzi Univ, Inst Informat & Control, Hangzhou, Peoples R China
[3] Univ Hong Kong, Queen Mary Hosp, Dept Orthopaed & Traumatol, Hong Kong, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
gait recognition; image entropy; gait dynamics; feature fusion; FACE RECOGNITION;
DOI
10.1117/1.JEI.31.1.013034
Chinese Library Classification
TM [Electrical Technology]; TN [Electronic and Communication Technology];
Discipline Codes
0808; 0809;
Abstract
We propose a robust gait recognition method based on the combination of global and local image entropy features. An improved feature extraction scheme is developed, in which binary walking silhouettes are characterized by global and local image entropy features. Gait dynamics underlying the image entropy features are derived and fused. In addition, pretrained deep neural networks are employed as feature extractors on the raw fused image entropy features. The extracted gait dynamics and deep transfer-learning features are finally fused and fed into a seven-layer fully connected network for identification. The proposed method makes full use of both global and local gait characteristics, which helps it resist variations in walking conditions. Experiments on the CASIA-B database demonstrate the effectiveness of the proposed method. (C) 2022 SPIE and IS&T
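The abstract's notion of global and local image entropy features can be illustrated with a minimal sketch. The code below is an assumption, not the authors' implementation: it computes the Shannon entropy of an averaged silhouette image (a gait-energy-style image in [0, 1]) globally and over a grid of local blocks; the 4x4 grid size, the `entropy_features` name, and the 256-bin histogram are all illustrative choices.

```python
import numpy as np

def shannon_entropy(patch):
    """Shannon entropy (bits) of a patch, from its intensity histogram."""
    hist, _ = np.histogram(patch, bins=256, range=(0.0, 1.0))
    p = hist / hist.sum()
    p = p[p > 0]  # drop empty bins so log2 is defined
    return float(-(p * np.log2(p)).sum())

def entropy_features(gei, grid=(4, 4)):
    """One global entropy value plus one local entropy per grid block.

    `gei` is assumed to be a float image in [0, 1] obtained by averaging
    binary walking silhouettes over a gait cycle.
    """
    h, w = gei.shape
    feats = [shannon_entropy(gei)]          # global feature
    bh, bw = h // grid[0], w // grid[1]
    for i in range(grid[0]):                # local features, block by block
        for j in range(grid[1]):
            block = gei[i * bh:(i + 1) * bh, j * bw:(j + 1) * bw]
            feats.append(shannon_entropy(block))
    return np.array(feats)
```

In this sketch the resulting 17-dimensional vector (1 global + 16 local values) would be the raw entropy descriptor that, per the abstract, is further processed by pretrained deep networks before fusion and classification.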
Pages: 15