Human 3D Pose Estimation with a Tilting Camera for Social Mobile Robot Interaction

Cited by: 20
Authors
Garcia-Salguero, Mercedes [1 ]
Gonzalez-Jimenez, Javier [1 ]
Moreno, Francisco-Angel [1 ]
Affiliations
[1] Univ Malaga, Machine Percept & Intelligent Robot Grp MAPIR, Dept Syst Engn & Automat, Biomed Res Inst Malaga IBIMA, E-29071 Malaga, Spain
Funding
EU Horizon 2020;
Keywords
human body pose estimation; 3D computer vision; camera pose calibration; human-robot interaction; OpenPose; RGB-D cameras; MODELS;
DOI
10.3390/s19224943
CLC number
O65 [Analytical Chemistry];
Subject classification codes
070302; 081704;
Abstract
Human-Robot interaction represents a cornerstone of mobile robotics, especially within the field of social robots. In this context, user localization becomes of crucial importance for the interaction. This work investigates the capabilities of wide field-of-view RGB cameras to estimate the 3D position and orientation (i.e., the pose) of a user in the environment. For that, we employ a social robot endowed with a fish-eye camera hosted in a tilting head and develop two complementary approaches: (1) a fast method relying on a single image that estimates the user pose from the detection of their feet and does not require either the robot or the user to remain static during the reconstruction; and (2) a method that captures several views of the scene while the camera is being tilted and does not need the feet to be visible. Due to the particular setup of the tilting camera, special equations for 3D reconstruction have been developed. In both approaches, a CNN-based skeleton detector (OpenPose) is employed to identify humans within the image. A set of experiments with real data validates our two proposed methods, yielding results similar to those of commercial RGB-D cameras while surpassing them in terms of coverage of the scene (wider FoV and longer range) and robustness to light conditions.
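The single-image method described above can be illustrated with a minimal sketch: once the feet keypoints are detected in the image, the viewing ray through that pixel is intersected with the floor plane, using the known camera height and tilt angle. This is not the authors' code; it assumes a simple pinhole model (the paper uses a fish-eye camera, which would need its own projection model), and all names and parameter values (`fx`, `fy`, `cx`, `cy`, `cam_height`, `tilt`) are illustrative.

```python
import numpy as np

def feet_to_floor(u, v, fx, fy, cx, cy, cam_height, tilt):
    """Back-project pixel (u, v) and intersect its ray with the floor (z = 0).

    Camera convention: +x right, +y down, +z forward (optical axis).
    World convention: camera at (0, 0, cam_height), +y forward, +z up;
    `tilt` is the downward pitch of the camera in radians.
    Returns the 3D floor point, or None if the ray never reaches the floor.
    """
    # Viewing ray in camera coordinates (pinhole model).
    d_cam = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])

    # Camera-to-world rotation: axis remap (cam z -> world y, cam y -> world -z)
    # composed with a downward pitch of `tilt` about the x-axis.
    c, s = np.cos(tilt), np.sin(tilt)
    R = np.array([[1.0, 0.0, 0.0],
                  [0.0,   c,   s],
                  [0.0,  -s,   c]]) @ np.array([[1.0, 0.0,  0.0],
                                                [0.0, 0.0,  1.0],
                                                [0.0, -1.0, 0.0]])
    d_world = R @ d_cam

    if d_world[2] >= 0:               # ray points up or parallel to the floor
        return None
    t = cam_height / -d_world[2]      # ray parameter where z reaches 0
    origin = np.array([0.0, 0.0, cam_height])
    return origin + t * d_world

# Feet seen at the principal point of a camera 1.2 m high, tilted 30 deg down:
# the user stands h / tan(tilt) ~ 2.08 m in front of the robot.
p = feet_to_floor(320, 240, fx=400, fy=400, cx=320, cy=240,
                  cam_height=1.2, tilt=np.deg2rad(30))
```

A sanity check for the geometry: a pixel at the principal point looks along the optical axis, so the ground distance reduces to the classic `cam_height / tan(tilt)`.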
Pages: 22