Real-time approach for gait analysis using the Kinect v2 sensor for clinical assessment purpose

Cited by: 6
Authors
Burle, Alexandre de Queiroz [1 ,2 ]
de Gusmao Lafayette, Thiago Buarque [1 ,2 ]
Fonseca, Jose Roberto [1 ,2 ]
Teichrieb, Veronica [1 ]
Fontes Da Gama, Alana Elza [1 ,2 ]
Affiliations
[1] Univ Fed Pernambuco UFPE, Voxar Labs, Informat Ctr, Recife, PE, Brazil
[2] Univ Fed Pernambuco UFPE, Rehabil Engn Res Grp, Biomed Engn Dept, Recife, PE, Brazil
Keywords
Kinect v2; Gait Analysis; Real-time; Clinical Environment; RGB-D sensor; MOTION CAPTURE; RECOGNITION; PARAMETERS; SYSTEM
DOI
10.1109/SVR51698.2020.00034
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Gait analysis is often performed in clinical environments to evaluate a patient's walking ability and support the rehabilitation process. In a walking test, real-time feedback is fundamental for the healthcare professional to evaluate gait quality, but the high cost of gold-standard equipment is a limiting factor. After the Microsoft Kinect v2 was released, many studies suggested that this RGB-D sensor could be used for clinical purposes. This research analyzed 75 different gait samples with a real-time approach, comparing the results against the measured ground truth for each parameter. Results showed that step length outcomes are accurate and stride length outcomes are precise. Regarding kinematic parameters, the knee and hip angle results indicate that these joints may be used for gait analysis, but foot angles were not reliable.
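For illustration only, the sketch below shows how spatiotemporal and kinematic gait parameters of the kind evaluated here (step length, knee angle) might be derived from Kinect v2 skeleton joint positions. It is a minimal assumption-laden example, not the authors' implementation; the function names and coordinate values are hypothetical.

```python
# Illustrative sketch (not the authors' implementation): deriving a knee
# flexion angle and a step length from Kinect v2 skeleton joint positions,
# assuming 3D coordinates in metres in the Kinect camera frame (x, y, z).
import numpy as np

def joint_angle(a, b, c):
    """Angle at joint b, in degrees, between segments b->a and b->c
    (e.g. hip-knee-ankle for the knee)."""
    a, b, c = map(np.asarray, (a, b, c))
    u, v = a - b, c - b
    cos_t = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    # Clip to [-1, 1] so rounding errors never produce NaN from arccos.
    return float(np.degrees(np.arccos(np.clip(cos_t, -1.0, 1.0))))

def step_length(leading_ankle, trailing_ankle):
    """Distance between the two ankles in the ground plane (x, z components)
    at a detected heel strike."""
    d = np.asarray(leading_ankle) - np.asarray(trailing_ankle)
    return float(np.linalg.norm(d[[0, 2]]))

# Hypothetical joint coordinates for a single frame:
hip, knee, ankle = [0.10, 0.90, 2.50], [0.10, 0.50, 2.45], [0.12, 0.10, 2.55]
print(f"knee angle: {joint_angle(hip, knee, ankle):.1f} deg")
print(f"step length: {step_length([0.12, 0.10, 2.55], [-0.10, 0.10, 2.20]):.2f} m")
```

In a real-time pipeline, such functions would run per frame on the tracked skeleton, with gait events (heel strikes) detected from the ankle trajectories before the distances are measured.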
Pages: 144-153
Number of pages: 10