An Integrated Platform for Live 3D Human Reconstruction and Motion Capturing

Cited by: 55
Authors
Alexiadis, Dimitrios S. [1 ]
Chatzitofis, Anargyros [1 ]
Zioulis, Nikolaos [1 ]
Zoidi, Olga [1 ]
Louizis, Georgios [1 ]
Zarpalas, Dimitrios [1 ]
Daras, Petros [1 ]
Affiliations
[1] Ctr Res & Technol Hellas, Informat Technol Inst, Thessaloniki 57001, Greece
Keywords
3D motion capture; 3D reconstruction; depth sensors; evaluation; Kinect; skeleton tracking; tele-immersion (TI); REAL-TIME;
DOI
10.1109/TCSVT.2016.2576922
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Discipline Codes
0808 ; 0809 ;
Abstract
The latest developments in 3D capturing, processing, and rendering provide the means to unlock novel 3D application pathways. This paper describes the main elements of an integrated platform that targets tele-immersion and future 3D applications, addressing the tasks of real-time capturing, robust 3D human shape/appearance reconstruction, and skeleton-based motion tracking. More specifically, the details of a multiple RGB-depth (RGB-D) capturing system are first given, along with a novel sensor calibration method. A robust, fast reconstruction method from multiple RGB-D streams is then proposed, based on an enhanced variation of the volumetric Fourier transform-based method, parallelized on the Graphics Processing Unit and accompanied by an appropriate texture-mapping algorithm. On top of that, given the lack of relevant objective evaluation methods, a novel framework is proposed for the quantitative evaluation of real-time 3D reconstruction systems. Finally, a generic method for accurate real-time human skeleton tracking from multiple depth streams is proposed. Detailed experimental results on multi-Kinect2 data sets verify the validity of our arguments and the effectiveness of the proposed system and methodologies.
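The abstract describes fusing multiple registered RGB-D streams into a volumetric representation of the captured human. The paper's own method is an enhanced Fourier transform-based variant running on the GPU; purely as an illustration of the general volumetric-fusion idea, the sketch below shows plain truncated signed distance field (TSDF) fusion on the CPU. All names and parameters (`fuse_depth_maps`, `trunc`, the grid layout) are invented for this example and are not from the paper.

```python
import numpy as np

def fuse_depth_maps(depth_maps, intrinsics, extrinsics,
                    grid_origin, voxel_size, grid_shape, trunc=0.05):
    """Fuse registered depth maps into a truncated signed distance
    field (TSDF) over a regular voxel grid; the fused surface lies
    at the zero crossings of the returned field."""
    tsdf = np.zeros(grid_shape, dtype=np.float32)
    weight = np.zeros(grid_shape, dtype=np.float32)

    # World coordinates of every voxel centre.
    ii, jj, kk = np.indices(grid_shape)
    pts = (np.stack([ii, jj, kk], axis=-1).reshape(-1, 3)
           * voxel_size + grid_origin)

    for depth, K, T in zip(depth_maps, intrinsics, extrinsics):
        # Transform voxel centres into this camera's frame (T: world->camera).
        cam = (T[:3, :3] @ pts.T + T[:3, 3:4]).T
        z = cam[:, 2]
        # Project onto the image plane with the pinhole intrinsics K.
        u = np.round(K[0, 0] * cam[:, 0] / z + K[0, 2]).astype(int)
        v = np.round(K[1, 1] * cam[:, 1] / z + K[1, 2]).astype(int)
        h, w = depth.shape
        valid = (z > 0) & (u >= 0) & (u < w) & (v >= 0) & (v < h)
        d = np.zeros_like(z)
        d[valid] = depth[v[valid], u[valid]]
        valid &= d > 0
        # Signed distance along the ray, truncated to [-trunc, trunc].
        sdf = np.clip(d - z, -trunc, trunc) / trunc
        # Only update voxels in front of, or just behind, the surface.
        upd = valid & (d - z > -trunc)
        flat_tsdf = tsdf.reshape(-1)   # views into the grids
        flat_w = weight.reshape(-1)
        flat_tsdf[upd] = ((flat_w[upd] * flat_tsdf[upd] + sdf[upd])
                          / (flat_w[upd] + 1))
        flat_w[upd] += 1

    return tsdf
```

With a single synthetic camera observing a flat plane, the returned field changes sign across the plane's depth, which is where a surface-extraction step (e.g. marching cubes) would place the mesh.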
Pages: 798-813
Page count: 16
Related Papers
50 records in total
  • [31] Capturing 3D human motion from monocular images using orthogonal locality preserving projection
    Zhao, Xu
    Liu, Yuncai
    DIGITAL HUMAN MODELING, 2007, 4561 : 304 - 313
  • [32] Framework for 3D Motion Field Estimation and Reconstruction
    Zagar, Martin
    Mlinaric, Hrvoje
    Knezovic, Josip
    ANNUAL 2010/2011 OF THE CROATIAN ACADEMY OF ENGINEERING, 2012, : 168 - 183
  • [33] 3D motion reconstruction from perspective projection
    Chen, Jiashi
    Zhuang, Yueting
    Zhu, Qiang
    Pan, Yunhe
    Jisuanji Fuzhu Sheji Yu Tuxingxue Xuebao/Journal of Computer-Aided Design and Computer Graphics, 2002, 14 (11): : 1041 - 1046
  • [34] 3D Motion Reconstruction for Real-World Camera Motion
    Zhu, Yingying
    Cox, Mark
    Lucey, Simon
    2011 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2011,
  • [35] A method of 3D human-motion capture and reconstruction based on depth information
    Quan Wei
    Jiang Shan
    Han Cheng
    Zhang Yu
    Bai Lijuan
    Zhao Haimei
    2016 IEEE INTERNATIONAL CONFERENCE ON MECHATRONICS AND AUTOMATION, 2016, : 187 - 192
  • [36] Research on Multi-view 3D Reconstruction of Human Motion Based on OpenPose
    Li, Xuhui
    Cai, Cheng
    Zhou, Hengyi
    COGNITIVE COMPUTING, ICCC 2021, 2022, 12992 : 72 - 78
  • [37] Bayesian reconstruction of 3D human motion from single-camera video
    Howe, NR
    Leventon, ME
    Freeman, WT
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 12, 2000, 12 : 820 - 826
  • [38] MotioNet: 3D Human motion reconstruction from monocular video with skeleton consistency
    Shi, Mingyi
    Aberman, Kfir
    Aristidou, Andreas
    Komura, Taku
    Lischinski, Dani
    Cohen-Or, Daniel
    Chen, Baoquan
    ACM Transactions on Graphics, 2020, 40 (01):
  • [39] Automatic reconstruction of 3D human arm motion from a monocular image sequence
    Valentina Filova
    Franc Solina
    Jadran Lenarčič
    Machine Vision and Applications, 1998, 10 : 223 - 231
  • [40] MotioNet: 3D Human Motion Reconstruction from Monocular Video with Skeleton Consistency
    Shi, Mingyi
    Aberman, Kfir
    Aristidou, Andreas
    Komura, Taku
    Lischinski, Dani
    Cohen-Or, Daniel
    Chen, Baoquan
    ACM TRANSACTIONS ON GRAPHICS, 2021, 40 (01):