Latent Trees for Estimating Intensity of Facial Action Units

Cited by: 0
Authors
Kaltwang, Sebastian [1 ]
Todorovic, Sinisa [2 ]
Pantic, Maja [1 ]
Institutions
[1] Imperial Coll London, London, England
[2] Oregon State Univ, Corvallis, OR 97331 USA
Funding
UK Engineering and Physical Sciences Research Council (EPSRC);
Keywords
EXPRESSIONS;
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This paper addresses estimating intensity levels of Facial Action Units (FAUs) in videos as an important step toward interpreting facial expressions. As input features, we use locations of facial landmark points detected in video frames. To address uncertainty of the input, we formulate a generative latent tree (LT) model, its inference, and novel algorithms for efficient learning of both LT parameters and structure. Our structure learning iteratively builds the LT by adding either a new edge or a new hidden node to the LT, starting from initially independent nodes of observable features. In each iteration, the graph-edit operation that maximally increases the likelihood while minimally increasing the model complexity is selected as optimal. For FAU intensity estimation, we derive closed-form expressions of the posterior marginals of all variables in the LT and specify an efficient bottom-up/top-down inference. Our evaluation on the benchmark DISFA and ShoulderPain datasets, in a subject-independent setting, demonstrates that we outperform the state of the art, even under significant noise in facial landmarks. The effectiveness of our structure learning is demonstrated by probabilistically sampling meaningful facial expressions from the LT.
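The greedy structure learning the abstract describes can be sketched as a score-driven loop: starting from independent observed nodes, repeatedly apply the single graph edit whose likelihood gain exceeds its complexity cost, and stop when no edit helps. The sketch below is a minimal illustration under stated assumptions: only edge-adding edits are shown (the paper's method also adds hidden nodes), and the `edge_gain` values are toy numbers standing in for the paper's actual likelihood computation; `learn_tree` and all names are hypothetical, not from the paper.

```python
# Hedged sketch of greedy latent-tree structure learning: in each
# iteration, pick the edit (here: adding one edge) whose score gain --
# likelihood improvement minus a complexity penalty -- is largest,
# stopping when no edit improves the score. Toy gains, not the paper's
# likelihood; hidden-node edits are omitted for brevity.

def learn_tree(nodes, edge_gain, penalty=1.0):
    """Greedily add edges while keeping the graph a forest (no cycles)."""
    parent = {v: v for v in nodes}          # union-find for cycle checks

    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]   # path halving
            v = parent[v]
        return v

    edges = []
    while True:
        # candidate edits: edges that would join two different components
        cands = [(g - penalty, e) for e, g in edge_gain.items()
                 if find(e[0]) != find(e[1])]
        if not cands:
            break
        gain, edge = max(cands)
        if gain <= 0:                       # model-selection stopping rule
            break
        edges.append(edge)
        parent[find(edge[0])] = find(edge[1])
    return edges

# Toy example: four observed "landmark" variables with assumed pairwise gains.
gains = {("a", "b"): 3.0, ("b", "c"): 2.5, ("c", "d"): 0.5, ("a", "d"): 0.2}
print(learn_tree(["a", "b", "c", "d"], gains))  # -> [('a', 'b'), ('b', 'c')]
```

Note the stopping rule: the weak ("c", "d") and ("a", "d") candidates are rejected because their gain does not cover the complexity penalty, which is the likelihood-versus-complexity trade-off the abstract describes.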
Pages: 296-304
Page count: 9
Related Papers
50 records in total
  • [1] Facial Emotion Recognition Through Detection of Facial Action Units and Their Intensity
    Borgalli R.A.
    Surve S.
    [J]. Scientific Visualization, 2022, 14 (01): 62-86
  • [2] Pain Intensity Evaluation Through Facial Action Units
    Zafar, Zuhair
    Khan, Nadeem Ahmad
    [J]. 2014 22ND INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2014: 4696-4701
  • [3] LAUNet: A Latent Action Units Network for Facial Expression Recognition
    Zhang, Junlin
    Hirota, Kaoru
    Dai, Yaping
    Yin, Sijie
    [J]. 2022 41ST CHINESE CONTROL CONFERENCE (CCC), 2022: 2513-2518
  • [4] Regression-based intensity estimation of facial action units
    Savran, Arman
    Sankur, Bulent
    Bilge, M. Taha
    [J]. IMAGE AND VISION COMPUTING, 2012, 30 (10): 774-784
  • [5] Milestone of Pain Intensity Evaluation from Facial Action Units
    Virrey, Reneiro Andal
    Caesarendra, Wahyu
    Petra, Muhammad Iskandar Bin Pg. Hj
    Abas, Emeroylariffion
    Husaini, Asmah
    Liyanage, Chandratilak De Silva
    [J]. 2019 3RD INTERNATIONAL CONFERENCE ON ELECTRICAL ENGINEERING AND COMPUTER SCIENCE (ICECOS 2019), 2019: 55-57
  • [6] Facial expression animation through action units transfer in latent space
    Fan, Yachun
    Tian, Feng
    Tan, Xiaohui
    Cheng, Housen
    [J]. COMPUTER ANIMATION AND VIRTUAL WORLDS, 2020, 31 (4-5)
  • [7] Pain Classification and Intensity Estimation Through the Analysis of Facial Action Units
    Paoli, Federica
    D'Eusanio, Andrea
    Cozzi, Federico
    Patania, Sabrina
    Boccignone, Giuseppe
    [J]. IMAGE ANALYSIS AND PROCESSING - ICIAP 2023 WORKSHOPS, PT I, 2024, 14365: 229-241
  • [8] Intensity Estimation of Spontaneous Facial Action Units Based on Their Sparsity Properties
    Mohammadi, Mohammad Reza
    Fatemizadeh, Emad
    Mahoor, Mohammad H.
    [J]. IEEE TRANSACTIONS ON CYBERNETICS, 2016, 46 (03): 817-826
  • [9] A Unified Probabilistic Framework For Measuring The Intensity of Spontaneous Facial Action Units
    Li, Yongqiang
    Mavadati, S. Mohammad
    Mahoor, Mohammad H.
    Ji, Qiang
    [J]. 2013 10TH IEEE INTERNATIONAL CONFERENCE AND WORKSHOPS ON AUTOMATIC FACE AND GESTURE RECOGNITION (FG), 2013,
  • [10] Measuring the intensity of spontaneous facial action units with dynamic Bayesian network
    Li, Yongqiang
    Mavadati, S. Mohammad
    Mahoor, Mohammad H.
    Zhao, Yongping
    Ji, Qiang
    [J]. PATTERN RECOGNITION, 2015, 48 (11): 3417-3427