Facial Expression Ambiguity and Face Image Quality Affect Differently on Expression Interpretation Bias

Cited: 4
Authors
Kinchella, Jade [1 ]
Guo, Kun [1 ]
Institution
[1] Univ Lincoln, Lincoln, England
Keywords
facial expression; expression ambiguity; image resolution; expression categorisation; gaze behaviour
DOI
10.1177/03010066211000270
CLC Classification
R77 [Ophthalmology]
Subject Classification
100212
Abstract
We often show invariant or comparable recognition performance for prototypical facial expressions, such as happiness and anger, across different viewing settings. However, it is unclear to what extent the categorisation of ambiguous expressions, and the associated interpretation bias, remain invariant under degraded viewing conditions. In this exploratory eye-tracking study, we systematically manipulated both facial expression ambiguity (by morphing happy and angry expressions in different proportions) and face image clarity/quality (by manipulating image resolution) to measure participants' expression categorisation performance, perceived expression intensity, and associated face-viewing gaze distribution. Our analysis revealed that increasing facial expression ambiguity and decreasing face image quality induced opposite directions of expression interpretation bias (negativity vs. positivity bias, i.e., increased anger vs. increased happiness categorisation), the same direction of deterioration in rated expression intensity, and qualitatively different influences on face-viewing gaze allocation (decreased gaze at the eyes but increased gaze at the mouth vs. a stronger central fixation bias). These novel findings suggest that, in comparison with prototypical facial expressions, our visual system has less perceptual tolerance when processing ambiguous expressions, which are subject to a viewing condition-dependent interpretation bias.
Pages: 328–342 (15 pages)