From Facial Expression Recognition to Interpersonal Relation Prediction

Cited by: 167
Authors
Zhang, Zhanpeng [1 ]
Luo, Ping [2 ]
Loy, Chen Change [2 ]
Tang, Xiaoou [2 ]
Affiliations
[1] SenseTime Grp Ltd, Shatin, Hong Kong, Peoples R China
[2] Chinese Univ Hong Kong, Dept Informat Engn, Shatin, Hong Kong, Peoples R China
Keywords
Facial expression recognition; Interpersonal relation; Deep convolutional network; IMAGES;
DOI
10.1007/s11263-017-1055-1
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Interpersonal relation defines the association, e.g., warmth, friendliness, and dominance, between two or more people. We investigate whether such fine-grained and high-level relation traits can be characterized and quantified from face images in the wild. We address this challenging problem by first studying a deep network architecture for robust recognition of facial expressions. Unlike existing models that typically learn from facial expression labels alone, we devise an effective multitask network that is capable of learning from rich auxiliary attributes such as gender, age, and head pose, beyond just facial expression data. While conventional supervised training requires datasets with complete labels (e.g., all samples must be labeled with gender, age, and expression), we show that this requirement can be relaxed via a novel attribute propagation method. The approach further allows us to leverage the inherent correspondences between heterogeneous attribute sources despite the disparate distributions of different datasets. With the network we demonstrate state-of-the-art results on existing facial expression recognition benchmarks. To predict interpersonal relation, we use the expression recognition network as branches of a Siamese model. Extensive experiments show that our model is capable of mining mutual context of faces for accurate fine-grained interpersonal prediction.
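The Siamese design described in the abstract (one shared expression branch applied to both faces, with the paired embeddings feeding a relation head) can be sketched as follows. This is a minimal illustration, not the authors' implementation: all layer sizes, the trait count, and the use of a single linear layer per component are assumptions made for brevity.

```python
import math
import random

random.seed(0)

# Hypothetical dimensions, chosen only for illustration.
IN_DIM, FEAT_DIM, N_TRAITS = 16, 8, 4

def rand_matrix(rows, cols):
    """Random weight matrix standing in for learned parameters."""
    return [[random.uniform(-0.1, 0.1) for _ in range(cols)] for _ in range(rows)]

def matvec(M, v):
    """Plain matrix-vector product."""
    return [sum(w * x for w, x in zip(row, v)) for row in M]

# ONE shared weight matrix serves as the "expression branch" for BOTH
# faces -- this weight sharing is what makes the model Siamese.
W_branch = rand_matrix(FEAT_DIM, IN_DIM)
# The relation head scores the concatenated pair of embeddings.
W_relation = rand_matrix(N_TRAITS, 2 * FEAT_DIM)

def branch(face):
    """Shared branch: embed one face descriptor."""
    return [math.tanh(x) for x in matvec(W_branch, face)]

def predict_relation(face_a, face_b):
    """Run both faces through the SAME branch, concatenate the
    embeddings, and emit a per-trait probability via a sigmoid."""
    pair = branch(face_a) + branch(face_b)
    return [1.0 / (1.0 + math.exp(-z)) for z in matvec(W_relation, pair)]

face_a = [random.gauss(0, 1) for _ in range(IN_DIM)]
face_b = [random.gauss(0, 1) for _ in range(IN_DIM)]
traits = predict_relation(face_a, face_b)
```

In the paper's full model the branch is a deep convolutional network pretrained with the multitask expression/attribute objective; here it is reduced to a single tanh layer so the weight-sharing structure is visible.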
Pages: 550 - 569
Page count: 20
Related Papers
50 records
  • [1] From Facial Expression Recognition to Interpersonal Relation Prediction
    Zhanpeng Zhang
    Ping Luo
    Chen Change Loy
    Xiaoou Tang
    International Journal of Computer Vision, 2018, 126 : 550 - 569
  • [2] Relation-Aware Facial Expression Recognition
    Xia, Yifan
    Yu, Hui
    Wang, Xiao
    Jian, Muwei
    Wang, Fei-Yue
    IEEE TRANSACTIONS ON COGNITIVE AND DEVELOPMENTAL SYSTEMS, 2022, 14 (03) : 1143 - 1154
  • [3] Facial Image expression recognition and prediction system
    Talukder, Animesh
    Ghosh, Surath
    SCIENTIFIC REPORTS, 2024, 14 (01)
  • [4] Research on Facial Expression Recognition Characteristics for Interpersonal Distance Using VR
    Ogoshi Y.
    Minamikawa N.
    Hayashi H.
    Saito Y.
    Ogoshi S.
    IEEJ Transactions on Electronics, Information and Systems, 2024, 144 (05) : 457 - 458
  • [5] Impaired emotional facial expression recognition is associated with interpersonal problems in alcoholism
    Kornreich, C
    Philippot, P
    Foisy, ML
    Blairy, S
    Raynaud, E
    Dan, B
    Hess, U
    Noël, X
    Pelc, I
    Verbanck, P
    ALCOHOL AND ALCOHOLISM, 2002, 37 (04): 394 - 400
  • [6] Relation and context augmentation network for facial expression recognition
    Ma, Xin
    Ma, Yingdong
    IMAGE AND VISION COMPUTING, 2022, 127
  • [7] Interpersonal relation recognition: a survey
    Hajer Guerdelli
    Claudio Ferrari
    Stefano Berretti
    Multimedia Tools and Applications, 2023, 82 : 11417 - 11439
  • [8] GLUCOSE LEVELS AND ITS RELATION TO EMPATHY AND RECOGNITION OF FACIAL EXPRESSION
    La Marca, Roberto
    Lozza, Nicla
    La Marca-Ghaemmaghami, Pearl
    Sollberger, Silja
    Langhans, Wolfgang
    Arnold, Myrtha
    Ehlert, Ulrike
    INTERNATIONAL JOURNAL OF BEHAVIORAL MEDICINE, 2018, 25 : S77 - S77
  • [9] Correction to: Interpersonal relation recognition: a survey
    Hajer Guerdelli
    Claudio Ferrari
    Stefano Berretti
    Multimedia Tools and Applications, 2023, 82 (8) : 11441 - 11441
  • [10] Facial expression recognition from video sequences
    Cohen, I
    Sebe, N
    Garg, A
    Lew, MS
    Huang, TS
    IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO, VOL I AND II, PROCEEDINGS, 2002, : A121 - A124