Effects of camera viewing angles on tracking kinematic gait patterns using Azure Kinect, Kinect v2 and Orbbec Astra Pro v2

Cited: 80
|
Authors
Yeung, Ling-Fung [1 ]
Yang, Zhenqun [1 ]
Cheng, Kenneth Chik-Chi [1 ]
Du, Dan [1 ,2 ]
Tong, Raymond Kai-Yu [1 ]
Affiliations
[1] Chinese Univ Hong Kong, Dept Biomed Engn, Hong Kong, Peoples R China
[2] Sichuan Univ, Coll Comp Sci, Chengdu, Sichuan, Peoples R China
Keywords
Motion capture; Depth sensor; Accuracy; Treadmill walking; Kinematics; MICROSOFT KINECT; CONCURRENT VALIDITY; POSE TRACKING; RELIABILITY; PARAMETERS; ACCURACY;
DOI
10.1016/j.gaitpost.2021.04.005
Chinese Library Classification
Q189 [Neuroscience];
Subject Classification Code
071006;
Abstract
Background: Depth sensors could be a portable, affordable, marker-less alternative to three-dimensional motion capture systems for gait analysis, but the effects of camera viewing angles on their joint angle tracking performance have not been fully investigated.
Research questions: This study evaluated the accuracy of three depth sensors [Azure Kinect (AK); Kinect v2 (K2); Orbbec Astra (OA)] for tracking kinematic gait patterns during treadmill walking at five camera viewing angles (0°/22.5°/45°/67.5°/90°).
Methods: Ten healthy subjects performed fifteen treadmill walking trials (3 speeds × 5 viewing angles) while the three depth sensors measured sagittal hip, frontal hip, sagittal knee, and sagittal ankle joint angles. Ten walking steps were recorded and averaged for each walking trial. Range of motion, in terms of the maximum and minimum joint angles measured by the depth sensors, was compared with the Vicon motion capture system as the gold standard. Depth sensor tracking accuracy was compared against the Vicon reference using root-mean-square error (RMSE) on the joint angle time series. Effects of walking speed, viewing angle, and depth sensor on tracking accuracy were assessed using three-way repeated-measures analysis of variance (ANOVA).
Results: ANOVA on RMSE showed significant interaction effects between viewing angle and depth sensor for the sagittal hip [F(8,72) = 4.404, p = 0.005] and sagittal knee [F(8,72) = 13.211, p < 0.001] joint angles. AK showed better tracking performance when subjects walked at non-frontal camera viewing angles (22.5°/45°/67.5°/90°), while K2 performed better at the frontal viewing angle (0°). The superior tracking performance of AK compared with K2/OA might be attributed to its improved depth sensor resolution and body tracking algorithm.
Significance: Researchers should be cautious about camera viewing angle when using depth sensors for kinematic gait measurements. Our results demonstrated that Azure Kinect had good tracking performance for sagittal hip and sagittal knee joint angles during treadmill walking tests at non-frontal camera viewing angles.
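The analysis described in the Methods (per-trial RMSE of each depth-sensor joint-angle time series against the Vicon reference, followed by a three-way repeated-measures ANOVA on sensor × viewing angle × speed) can be illustrated with a minimal Python sketch. This is not the authors' code: the table layout and column names (subject, sensor, angle, speed, rmse_knee_sag) are hypothetical, and the use of statsmodels' AnovaRM is an assumption for illustration only.

import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

def rmse(sensor_angles, vicon_angles):
    # Root-mean-square error (degrees) between two time-aligned joint-angle series.
    sensor_angles = np.asarray(sensor_angles, dtype=float)
    vicon_angles = np.asarray(vicon_angles, dtype=float)
    return float(np.sqrt(np.mean((sensor_angles - vicon_angles) ** 2)))

def three_way_rm_anova(trials: pd.DataFrame) -> pd.DataFrame:
    # `trials` is assumed to be a long-format table with one RMSE value per
    # subject x sensor (AK/K2/OA) x viewing angle (0/22.5/45/67.5/90) x speed.
    aov = AnovaRM(
        data=trials,
        depvar="rmse_knee_sag",            # e.g. sagittal-knee RMSE per trial (hypothetical column)
        subject="subject",
        within=["sensor", "angle", "speed"],
    ).fit()
    return aov.anova_table                 # F statistics, df and p-values for main effects and interactions

In such a table, a significant sensor:angle interaction term would correspond to the viewing angle × depth sensor interaction effects reported in the Results.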
Pages: 19-26
Number of pages: 8