Facial Expression Recognition via Deep Action Units Graph Network Based on Psychological Mechanism

Citations: 46
Authors
Liu, Yang [1 ]
Zhang, Xingming [1 ]
Lin, Yubei [2 ]
Wang, Haoxiang [1 ]
Affiliations
[1] South China Univ Technol, Sch Comp Sci & Engn, Guangzhou 510006, Peoples R China
[2] South China Univ Technol, Sch Software Engn, Guangzhou 510006, Peoples R China
Keywords
Psychology; Feature extraction; Gold; Face recognition; Task analysis; Correlation; Face; Action units (AUs); deep graph-based network; facial expression recognition (FER); facial graph representation; psychological mechanism; FACE;
DOI
10.1109/TCDS.2019.2917711
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Facial expression recognition (FER) is currently a very attractive research field in cognitive psychology and artificial intelligence. In this paper, an innovative FER algorithm called the deep action units graph network (DAUGN) is proposed based on psychological mechanisms. First, a segmentation method is designed to divide the face into small key areas, which are then converted into corresponding action-unit (AU)-related facial expression regions. Second, the local appearance features of these critical regions are extracted for further AU analysis. Then, an AU facial graph is constructed to represent expressions, taking the AU-related regions as vertices and the distances between pairs of landmarks as edges. Finally, the adjacency matrices of the facial graph are fed into a graph-based convolutional neural network that combines local-appearance and global-geometry information, which greatly improves FER performance. Experiments and comparisons on the CK+, MMI, and SFEW data sets reveal that the DAUGN achieves more competitive results than several other popular approaches.
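The abstract's pipeline (AU-related regions as vertices, landmark distances as edges, adjacency matrix fed to a graph convolution) can be illustrated with a minimal NumPy sketch. This is not the authors' DAUGN implementation; the region count, feature sizes, Gaussian edge weighting, and the Kipf-and-Welling-style propagation rule are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy facial graph: 5 hypothetical AU-related regions (e.g. brows, eyes,
# nose, mouth corners), each with a 2-D landmark coordinate and a small
# local appearance feature vector. All sizes are illustrative.
num_regions, feat_dim, hidden_dim = 5, 8, 4
landmarks = rng.random((num_regions, 2))        # landmark positions
features = rng.random((num_regions, feat_dim))  # local appearance features

# Edge weights from pairwise landmark distances: closer regions interact
# more strongly (a Gaussian kernel is one common, assumed choice).
diff = landmarks[:, None, :] - landmarks[None, :, :]
dist = np.linalg.norm(diff, axis=-1)
adjacency = np.exp(-dist**2 / (2 * 0.1))
np.fill_diagonal(adjacency, 0.0)

# One graph-convolution layer, H' = ReLU(D^-1/2 (A + I) D^-1/2 H W),
# combining each region's appearance features with its neighbors'.
def gcn_layer(adj, feats, weight):
    a_hat = adj + np.eye(adj.shape[0])          # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(a_hat.sum(axis=1)))
    return np.maximum(d_inv_sqrt @ a_hat @ d_inv_sqrt @ feats @ weight, 0.0)

weight = rng.standard_normal((feat_dim, hidden_dim))
hidden = gcn_layer(adjacency, features, weight)
print(hidden.shape)  # (5, 4): one hidden vector per AU region
```

A full network would stack several such layers and pool the per-region vectors into an expression classifier; the sketch only shows how geometry (edges) and appearance (vertex features) are fused in one propagation step.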
Pages: 311 - 322 (12 pages)
Related Papers
(50 records)
  • [1] LAUNet: A Latent Action Units Network for Facial Expression Recognition
    Zhang, Junlin
    Hirota, Kaoru
    Dai, Yaping
    Yin, Sijie
    [J]. 2022 41ST CHINESE CONTROL CONFERENCE (CCC), 2022, : 2513 - 2518
  • [2] Automated facial expression recognition based on FACS action units
    Lien, JJ
    Kanade, T
    Cohn, JF
    Li, CC
    [J]. AUTOMATIC FACE AND GESTURE RECOGNITION - THIRD IEEE INTERNATIONAL CONFERENCE PROCEEDINGS, 1998, : 390 - 395
  • [3] Facial Action Units-Based Image Retrieval for Facial Expression Recognition
    Trinh Thi Doan Pham
    Kim, Sesong
    Lu, Yucheng
    Jung, Seung-Won
    Won, Chee-Sun
    [J]. IEEE ACCESS, 2019, 7 : 5200 - 5207
  • [4] Facial Expression Recognition via a Boosted Deep Belief Network
    Liu, Ping
    Han, Shizhong
    Meng, Zibo
    Tong, Yan
    [J]. 2014 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2014, : 1805 - 1812
  • [5] Facial expression recognition based on bidirectional gated recurrent units within deep residual network
    Shen, Wenjuan
    Li, Xiaoling
    [J]. INTERNATIONAL JOURNAL OF INTELLIGENT COMPUTING AND CYBERNETICS, 2020, 13 (04) : 527 - 543
  • [6] FERGCN: facial expression recognition based on graph convolution network
    Liao, Lei
    Zhu, Yu
    Zheng, Bingbing
    Jiang, Xiaoben
    Lin, Jiajun
    [J]. MACHINE VISION AND APPLICATIONS, 2022, 33 (03)
  • [7] Facial Expression Recognition Network Based on Attention Mechanism
    Zhang, Wei
    Li, Pu
    [J]. Tianjin Daxue Xuebao (Ziran Kexue yu Gongcheng Jishu Ban)/Journal of Tianjin University Science and Technology, 2022, 55 (07): : 706 - 713
  • [8] Facial micro-expression recognition based on motion magnification network and graph attention mechanism
    Wu, Falin
    Xia, Yu
    Hu, Tiangyang
    Ma, Boyi
    Yang, Jingyao
    Li, Haoxin
    [J]. HELIYON, 2024, 10 (16)
  • [9] Automatic stress analysis from facial videos based on deep facial action units recognition
    Giannakakis, Giorgos
    Koujan, Mohammad Rami
    Roussos, Anastasios
    Marias, Kostas
    [J]. PATTERN ANALYSIS AND APPLICATIONS, 2022, 25 : 521 - 535