Real Time Feature Based 3-D Deformable Face Tracking

Times Cited: 0
Authors
Zhang, Wei [1 ]
Wang, Qiang [2 ]
Tang, Xiaoou [1 ]
Affiliations
[1] Chinese Univ Hong Kong, Dept Informat Engn, Hong Kong, Hong Kong, Peoples R China
[2] Microsoft Res Asia, Beijing, Peoples R China
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
In this paper, we develop a novel framework for tracking non-rigid 3D face deformation from a single camera. The difficulty of the problem lies in the fact that 3D deformation parameter estimation becomes unstable when there are few reliable facial feature correspondences. Unfortunately, this often occurs in real tracking scenarios where there is significant illumination change, motion blur, or large pose variation. In order to extract more feature-correspondence information, the proposed framework integrates three types of features that discriminate face deformation across different views: 1) the semantic features, which provide constant correspondences between 3D model points and major facial features; 2) the silhouette features, which provide dynamic correspondences between 3D model points and the facial silhouette under varying views; 3) the online tracking features, which provide redundant correspondences between 3D model points and salient image features. The integration of these complementary features is important for robust estimation of the 3D parameters. To estimate the high-dimensional 3D deformation parameters, we develop a hierarchical parameter estimation algorithm that robustly estimates both the rigid and the non-rigid 3D parameters. We show the importance of both feature fusion and hierarchical parameter estimation for reliably tracking 3D face deformation. Experiments demonstrate the robustness and accuracy of the proposed algorithm, especially in cases of agile head motion, …
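The abstract describes, at a high level, alternating estimation of rigid pose and non-rigid deformation from fused 2D-3D correspondences. The sketch below is a minimal illustration of that general idea, not the authors' implementation: it assumes a weak-perspective camera, a linear deformation basis, and per-correspondence weights standing in for the three feature types (semantic, silhouette, online-tracked); all function names (project, estimate_rigid, estimate_deformation, hierarchical_fit) are hypothetical.

```python
# Minimal sketch of hierarchical rigid / non-rigid fitting over fused 2D-3D
# correspondences. Assumptions (not from the paper): weak-perspective camera,
# linear deformation basis, plain weighted least squares.
import numpy as np

def project(points_3d, R2, t, s):
    """Weak-perspective projection x = s * R2 @ X + t, with R2 a 2x3 row-orthonormal map."""
    return s * (points_3d @ R2.T) + t

def estimate_rigid(model_pts, image_pts, weights):
    """Weighted orthographic-Procrustes fit of scale, rotation, and translation."""
    w = weights / weights.sum()
    mu3 = (w[:, None] * model_pts).sum(axis=0)
    mu2 = (w[:, None] * image_pts).sum(axis=0)
    X = (model_pts - mu3) * np.sqrt(w)[:, None]    # centred, weighted 3D points
    Y = (image_pts - mu2) * np.sqrt(w)[:, None]    # centred, weighted 2D points
    A = Y.T @ np.linalg.pinv(X).T                  # 2x3 least-squares affine fit
    U, S, Vt = np.linalg.svd(A, full_matrices=False)
    R2 = U @ Vt                                    # nearest row-orthonormal 2x3 map
    s = S.mean()                                   # isotropic scale
    t = mu2 - s * (mu3 @ R2.T)
    return R2, t, s

def estimate_deformation(basis, residual_2d, R2, s, lam=1e-2):
    """Ridge-regularised least squares for the non-rigid coefficient update."""
    n, _, k = basis.shape                          # basis: N x 3 x K deformation modes
    J = s * np.einsum('ij,njk->nik', R2, basis).reshape(2 * n, k)
    b = residual_2d.reshape(2 * n)
    return np.linalg.solve(J.T @ J + lam * np.eye(k), J.T @ b)

def hierarchical_fit(mean_shape, basis, image_pts, weights, iters=5):
    """Alternate rigid pose and deformation estimation, coarse to fine."""
    alpha = np.zeros(basis.shape[2])
    for _ in range(iters):
        shape = mean_shape + basis @ alpha         # current deformed 3D shape
        R2, t, s = estimate_rigid(shape, image_pts, weights)
        residual = image_pts - project(shape, R2, t, s)
        alpha += estimate_deformation(basis, residual, R2, s)
    return R2, t, s, alpha
```

In the paper's setting, the per-correspondence weights would plausibly encode the reliability of each feature type, e.g. high for semantic landmarks, view-dependent for silhouette points, and lower for redundant online-tracked features; a robust loss or outlier-rejection stage would typically replace the plain least squares used in this sketch.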
Pages: 720+
Page count: 4
Related Papers
50 records in total
  • [41] Implementation of a modular real-time feature-based architecture applied to visual face tracking
    Castañeda, B
    Luzanov, Y
    Cockburn, JC
    PROCEEDINGS OF THE 17TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION, VOL 4, 2004, : 167 - 170
  • [42] VIEW-BASED APPEARANCE MODEL ONLINE LEARNING FOR 3D DEFORMABLE FACE TRACKING
    Lefevre, Stephanie
    Odobez, Jean-Marc
    VISAPP 2010: PROCEEDINGS OF THE INTERNATIONAL CONFERENCE ON COMPUTER VISION THEORY AND APPLICATIONS, VOL 1, 2010, : 223 - 230
  • [43] A feature based method for tracking the 3-D trajectory and the orientation of a signer's hand
    Rezaei, Ahmadreza
    Vafadoost, Mansur
    Rezaei, Sadegh
    Shekofteh, Yasser
    2008 INTERNATIONAL CONFERENCE ON COMPUTER AND COMMUNICATION ENGINEERING, VOLS 1-3, 2008, : 346 - +
  • [44] Real-time tracking of deformable objects based on combined matching-and-tracking
    Yan, Junhua
    Wang, Zhigang
    Wang, Shunfei
    JOURNAL OF ELECTRONIC IMAGING, 2016, 25 (02)
  • [45] Real-time tracking of deformable objects based on MOK algorithm
    Yan, Junhua
    Wang, Zhigang
    Wang, Shunfei
    JOURNAL OF SYSTEMS ENGINEERING AND ELECTRONICS, 2016, 27 (02) : 477 - 483
  • [47] Real-time 3D face tracking based on active appearance model constrained by depth data
    Smolyanskiy, Nikolai
    Huitema, Christian
    Liang, Lin
    Anderson, Sean Eron
    IMAGE AND VISION COMPUTING, 2014, 32 (11) : 860 - 869
  • [48] Real-time 3D Face Reconstruction and Gaze Tracking for Virtual Reality
    Chen, Shu-Yu
    Gao, Lin
    Lai, Yu-Kun
    Rosin, Paul L.
    Xia, Shihong
    25TH 2018 IEEE CONFERENCE ON VIRTUAL REALITY AND 3D USER INTERFACES (VR), 2018, : 525 - 526
  • [49] Real-time 3D face tracking with mutual information and active contours
    Panin, Giorgio
    Knoll, Alois
    ADVANCES IN VISUAL COMPUTING, PT I, 2007, 4841 : 1 - 12
  • [50] 3D face pose estimation by a robust real time tracking of facial features
    Chun, Junchul
    Kim, Wonggi
    Multimedia Tools and Applications, 2016, 75 : 15693 - 15708