Converging Evidence for the Advantage of Dynamic Facial Expressions

Cited by: 98
Authors
Arsalidou, Marie [1]
Morris, Drew [1]
Taylor, Margot J. [1]
Affiliations
[1] Univ Toronto, Hosp Sick Children, Res Inst, Toronto, ON M5G 1X8, Canada
Keywords
Dynamic facial expressions; Facial motion; fMRI; ALE meta-analysis; SUPERIOR TEMPORAL SULCUS; FUNCTIONAL NEUROANATOMY; RIGHT-HEMISPHERE; NEURAL SYSTEMS; EMOTION; PERCEPTION; METAANALYSIS; ACTIVATION; MOTION; FACES;
DOI
10.1007/s10548-011-0171-4
Chinese Library Classification
R74 [Neurology and Psychiatry]
Abstract
Neuroimaging evidence suggests that dynamic facial expressions elicit greater activity than static face stimuli in brain structures associated with social cognition, interpreted as reflecting greater ecological validity. However, a quantitative meta-analysis of brain activity associated with dynamic facial expressions is lacking. The current study investigated, using three fMRI experiments, activity elicited by (a) dynamic and static happy faces, (b) dynamic and static happy and angry faces, and (c) dynamic faces and dynamic flowers. In addition, using activation likelihood estimate (ALE) meta-analysis, we determined areas concordant across published studies that (a) used dynamic faces and (b) specifically compared dynamic and static emotional faces. The middle temporal gyri (Experiment 1) and superior temporal sulci (STS; Experiments 1 and 2) were more active for dynamic than static faces. In contrasts with baseline, the amygdalae were more active for dynamic faces (Experiments 1 and 2), and the fusiform gyri were active in all conditions (all experiments). The ALE meta-analyses revealed concordant activation in all of these regions, as well as in areas associated with cognitive manipulations (inferior frontal gyri). Converging data from the experiments and the meta-analyses suggest that dynamic facial stimuli elicit increased activity in regions associated with the interpretation of social signals and emotional processing.
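The ALE method named in the abstract models each reported activation focus as a 3D Gaussian probability distribution and scores each voxel by the union of those probabilities across experiments, so voxels where many studies report nearby foci receive high concordance scores. The sketch below is a minimal, illustrative toy version of that idea only, not the authors' analysis: the grid size, Gaussian width, scaling, and example foci are assumptions, and real ALE analyses typically work in standardized stereotaxic space with empirically calibrated kernels and permutation-based significance thresholding, none of which is reproduced here.

```python
import numpy as np

def modeled_activation(shape, foci, sigma):
    """Per-experiment map: each reported focus is an isotropic 3D Gaussian;
    each voxel takes the maximum value across that experiment's foci."""
    zz, yy, xx = np.indices(shape)
    ma = np.zeros(shape)
    for fz, fy, fx in foci:
        d2 = (zz - fz) ** 2 + (yy - fy) ** 2 + (xx - fx) ** 2
        g = 0.5 * np.exp(-d2 / (2.0 * sigma ** 2))  # toy scaling keeps values below 1
        ma = np.maximum(ma, g)
    return ma

def ale_map(experiments, shape, sigma=2.0):
    """Voxel-wise ALE score: probabilistic union of the per-experiment maps."""
    ale = np.zeros(shape)
    for foci in experiments:
        ma = modeled_activation(shape, foci, sigma)
        ale = 1.0 - (1.0 - ale) * (1.0 - ma)
    return ale

# Two hypothetical experiments, each reporting activation foci as voxel indices.
experiments = [[(10, 10, 10), (12, 11, 10)],
               [(11, 10, 9)]]
scores = ale_map(experiments, shape=(20, 20, 20))
print(scores.max())  # voxels where foci from different experiments converge score highest
```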
Pages: 149-163
Number of pages: 15
Related Papers
50 records in total
  • [41] Converging electrophysiological evidence for a processing advantage of social over nonsocial feedback
    Daniela M. Pfabigan
    Shihui Han
    [J]. Cognitive, Affective, & Behavioral Neuroscience, 2019, 19 : 1170 - 1183
  • [42] Converging evidence: Network structure effects on conventionalization of gestural referring expressions
    Richie, Russell
    Hall, Matthew L.
    Cho, Pyeong Whan
    Coppola, Marie
    [J]. LANGUAGE DYNAMICS AND CHANGE, 2020, 10 : 259 - 290
  • [43] Dissociation Between Recognition and Detection Advantage for Facial Expressions: A Meta-Analysis
    Nummenmaa, Lauri
    Calvo, Manuel G.
    [J]. EMOTION, 2015, 15 (02) : 243 - 256
  • [44] Converging neural and behavioral evidence for a rapid, generalized response to threat-related facial expressions in 3-year-old children
    Xie, Wanze
    Leppanen, Jukka M.
    Kane-Grade, Finola E.
    Nelson, Charles A.
    [J]. NEUROIMAGE, 2021, 229
  • [45] Going beyond universal expressions: investigating the visual perception of dynamic facial expressions
    Kaulard, K.
    Wallraven, C.
    Cunningham, D. W.
    Buelthoff, H. H.
    [J]. PERCEPTION, 2009, 38 : 83 - 83
  • [46] Copycat of dynamic facial expressions: Superior volitional motor control for expressions of disgust
    Recio, Guillermo
    Sommer, Werner
    [J]. NEUROPSYCHOLOGIA, 2018, 119 : 512 - 523
  • [47] Quantifying dynamic facial expressions under naturalistic conditions
    Jeganathan, Jayson
    Campbell, Megan
    Hyett, Matthew
    Parker, Gordon
    Breakspear, Michael
    [J]. ELIFE, 2022, 11
  • [48] Orientation Selectivity for Representing Dynamic Diversity of Facial Expressions
    Madokoro, H.
    Sato, K.
    [J]. JOURNAL OF COMPUTERS, 2012, 7 (09) : 2107 - 2113
  • [49] Classification of dynamic facial expressions of emotion presented briefly
    Recio, Guillermo
    Schacht, Annekathrin
    Sommer, Werner
    [J]. COGNITION & EMOTION, 2013, 27 (08) : 1486 - 1494
  • [50] Perception of dynamic changes in facial expressions of emotion in autism
    Pelphrey, KA
    Morris, JP
    McCarthy, G
    LaBar, KS
    [J]. JOURNAL OF COGNITIVE NEUROSCIENCE, 2005: 63 - 63