A study on gait recognition using deep neural network ensemble

Cited by: 0
Authors
Hong S. [1]
Lee S. [2]
Lee H.
Affiliations
[1] School of Information Technology, Sungkonghoe University
[2] Dept. of Railroad Electrical and Electronics Engineering, Korea National University of Transportation
Funding
National Research Foundation of Singapore
Keywords
Deep neural network; Deep neural network ensemble; Gait Energy Image (GEI); Gait Recognition; Motion Silhouette Image (MSI)
DOI
10.5370/KIEE.2020.69.7.1125
Abstract
Recognizing a person from his or her gait has become a focus in computer vision because of its unique advantages, such as being non-invasive and human-friendly. Gait recognition, however, has the weakness of being less reliable than other biometrics. In this paper, we apply a deep neural network ensemble to the gait recognition problem. A deep neural network ensemble is a learning paradigm in which a collection of deep neural networks is trained for the same task. In general, an ensemble shows better generalization performance than a single deep neural network such as a convolutional neural network or a recurrent neural network. To increase the reliability of gait recognition, the gait energy image (GEI) and the motion silhouette image (MSI) are extracted as gait features, and an ensemble of convolutional and recurrent neural networks is used as the classifier. Experiments are performed on the NLPR and SOTON databases to show the efficiency of the proposed algorithm. The proposed method performs 4.55%, 4.85%, 2.5%, and 2.43% better than a single CNN on the two databases. As a result, we can build a recognition system with accuracies of 100%, 100%, and 94% on the NLPR database and 97.35% on the SOTON database. © 2020 Korean Institute of Electrical Engineers. All rights reserved.
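For illustration only (this is not the authors' code): a minimal Python sketch of the two gait features named in the abstract and of one simple ensemble fusion rule. The GEI averages aligned binary silhouettes over a gait cycle, GEI(x, y) = (1/T) * sum_t B_t(x, y); the MSI keeps a per-pixel motion history that is refreshed wherever the current silhouette is on and decays elsewhere; fusion here is plain averaging of per-network class probabilities. The function names, array shapes, and decay constant are assumptions, and averaging is only one plausible fusion rule for combining the CNN and RNN ensemble members.

import numpy as np

def gait_energy_image(silhouettes):
    """GEI: mean of aligned binary silhouettes (T, H, W) over one gait cycle."""
    return silhouettes.astype(np.float32).mean(axis=0)

def motion_silhouette_image(silhouettes, decay=15.0):
    """MSI: a pixel is set to full intensity whenever the current silhouette
    covers it, and otherwise decays toward 0 (decay rate is an assumption)."""
    msi = np.zeros(silhouettes.shape[1:], dtype=np.float32)
    for frame in silhouettes:
        msi = np.where(frame > 0, 255.0, np.maximum(msi - decay, 0.0))
    return msi

def ensemble_predict(prob_list):
    """Average the class-probability outputs of several networks, then argmax."""
    return int(np.argmax(np.mean(prob_list, axis=0)))

# Toy usage: 30 random 64x44 "silhouettes", 3 networks, 5 subjects.
frames = (np.random.rand(30, 64, 44) > 0.5).astype(np.uint8)
gei, msi = gait_energy_image(frames), motion_silhouette_image(frames)
probs = [np.random.dirichlet(np.ones(5)) for _ in range(3)]
print(gei.shape, msi.shape, ensemble_predict(probs))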
Pages: 1125-1130
Page count: 5
Related Papers
50 records in total
  • [1] GaitDONet: Gait Recognition Using Deep Features Optimization and Neural Network
    Khan, Muhammad Attique
    Khan, Awais
    Alhaisoni, Majed
    Alqahtani, Abdullah
    Armghan, Ammar
    Althubiti, Sara A.
    Alenezi, Fayadh
    Mey, Senghour
    Nam, Yunyoung
    [J]. CMC-COMPUTERS MATERIALS & CONTINUA, 2023, 75(3): 5087-5103
  • [2] An Efficient Gait Recognition Based on a Selective Neural Network Ensemble
    Lee, Heesung
    Hong, Sungjun
    Kim, Euntai
    [J]. INTERNATIONAL JOURNAL OF IMAGING SYSTEMS AND TECHNOLOGY, 2008, 18(4): 237-241
  • [3] Gait Emotion Recognition Using a Bi-modal Deep Neural Network
    Bhatia, Yajurv
    Bari, A. S. M. Hossain
    Gavrilova, Marina
    [J]. ADVANCES IN VISUAL COMPUTING, ISVC 2022, PT I, 2022, 13598: 46-60
  • [4] Neural network ensemble with probabilistic fusion and its application to gait recognition
    Lee, Heesung
    Hong, SungJun
    Kim, Euntai
    [J]. NEUROCOMPUTING, 2009, 72(7-9): 1557-1564
  • [5] Gait Recognition Using Convolutional Neural Network
    Sheth, Abhishek
    Sharath, Meghana
    Reddy, Sai Charan
    Sindhu, K.
    [J]. INTERNATIONAL JOURNAL OF ONLINE AND BIOMEDICAL ENGINEERING, 2023, 19(1): 107-118
  • [6] Distribution Network Connectivity Recognition Based on Ensemble Deep Neural Network
    Jiang W.
    Tang H.
    Qi H.
    Chen H.
    Chen J.
    Jiao H.
    [J]. Dianli Xitong Zidonghua/Automation of Electric Power Systems, 2020, 44(1): 101-108
  • [7] A Case Study on Scene Recognition Using an Ensemble Convolution Neural Network
    Oh, Bongjin
    Lee, Junhyeok
    [J]. 2018 20TH INTERNATIONAL CONFERENCE ON ADVANCED COMMUNICATION TECHNOLOGY (ICACT), 2018: 351-353
  • [8] Gait Phase Recognition Using Deep Convolutional Neural Network with Inertial Measurement Units
    Su, Binbin
    Smith, Christian
    Farewik, Elena Gutierrez
    [J]. BIOSENSORS-BASEL, 2020, 10(9)
  • [9] KinectGaitNet: Kinect-Based Gait Recognition Using Deep Convolutional Neural Network
    Bari, A. S. M. Hossain
    Gavrilova, Marina L.
    [J]. SENSORS, 2022, 22(7)
  • [10] A customizable framework for multimodal emotion recognition using ensemble of deep neural network models
    Chhavi Dixit
    Shashank Mouli Satapathy
    [J]. Multimedia Systems, 2023, 29: 3151-3168