The widespread action observation/execution matching system for facial expression processing

Cited by: 5
Authors
Sato, Wataru [1 ,4 ]
Kochiyama, Takanori [2 ]
Yoshikawa, Sakiko [3 ]
Affiliations
[1] RIKEN, Guardian Robot Project, Psychol Proc Res Team, Kyoto, Japan
[2] ATR Promot, Brain Act Imaging Ctr, Kyoto, Japan
[3] Kyoto Univ Arts, Kyoto, Japan
[4] RIKEN, Guardian Robot Project, Psychol Proc Res Team, 2-2-2 Hikaridai, Seika, Kyoto 6190288, Japan
Funding
Japan Science and Technology Agency (JST)
Keywords
amygdala; cerebellum; dynamic facial expressions of emotion; facial nerve nucleus; group independent component analysis (ICA); mirror neuron system; MIRROR NEURON SYSTEM; CERVICAL-SPINAL-CORD; HUMAN BRAIN; PREMOTOR CORTEX; MOTOR; FMRI; PERCEPTION; RESPONSES; MIMICRY; FACE;
DOI
10.1002/hbm.26262
CLC number
Q189 [Neuroscience]
Discipline code
071006
Abstract
Observing and understanding others' emotional facial expressions, possibly through motor synchronization, plays a primary role in face-to-face communication. To understand the underlying neural mechanisms, previous functional magnetic resonance imaging (fMRI) studies investigated brain regions involved in both the observation and execution of emotional facial expressions and found that the neocortical motor regions constituting the action observation/execution matching system, or mirror neuron system, were active. However, it remains unclear (1) whether other brain regions in the limbic system, cerebellum, and brainstem could also be involved in the observation/execution matching system for processing facial expressions, and (2) if so, whether these regions constitute a functional network. To investigate these issues, we performed fMRI while participants observed dynamic facial expressions of anger and happiness and while they executed facial muscle activity associated with angry and happy facial expressions. Conjunction analyses revealed that, in addition to neocortical regions (i.e., the right ventral premotor cortex and right supplementary motor area), the bilateral amygdala, right basal ganglia, bilateral cerebellum, and right facial nerve nucleus were activated during both the observation and execution tasks. Group independent component analysis revealed that a functional network component involving the aforementioned regions was activated during both tasks. The data suggest that the motor synchronization of emotional facial expressions involves a widespread observation/execution matching network encompassing the neocortex, limbic system, basal ganglia, cerebellum, and brainstem.
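As a rough illustration of the two analysis ideas named in the abstract (conjunction analysis across observation and execution tasks, and group independent component analysis), the Python sketch below runs a minimum-statistic conjunction over two contrast maps and a temporal-concatenation group ICA on synthetic data. This is not the authors' pipeline: the array shapes, the z-threshold of 3.1, and the use of scikit-learn's FastICA are illustrative assumptions.

```python
# Minimal sketch (not the authors' pipeline): (1) conjunction analysis as the
# voxelwise minimum of two statistical maps, and (2) group ICA via temporal
# concatenation of subjects' fMRI data. All sizes and thresholds are assumed.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)

# --- Conjunction analysis (minimum statistic over two contrasts) ----------
n_voxels = 5000
z_observe = rng.normal(size=n_voxels)      # z-map: observation > baseline
z_execute = rng.normal(size=n_voxels)      # z-map: execution  > baseline
z_conj = np.minimum(z_observe, z_execute)  # conjunction statistic
conjunction_mask = z_conj > 3.1            # assumed threshold (~p < .001)
print(f"voxels active in BOTH tasks: {conjunction_mask.sum()}")

# --- Group ICA by temporal concatenation ----------------------------------
n_subjects, n_timepoints, n_components = 5, 120, 10
# stack each subject's (time x voxel) matrix along the time axis
group_data = np.vstack(
    [rng.normal(size=(n_timepoints, n_voxels)) for _ in range(n_subjects)]
)
ica = FastICA(n_components=n_components, random_state=0, max_iter=500)
time_courses = ica.fit_transform(group_data)  # (subjects*time, components)
spatial_maps = ica.components_                # (components, voxels)

# A component whose spatial map loads heavily inside the conjunction mask
# would be a candidate "observation/execution matching" network component.
overlap = np.abs(spatial_maps[:, conjunction_mask]).mean(axis=1)
print("mean |weight| inside conjunction mask, per component:", overlap.round(3))
```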
Pages: 3057-3071
Number of pages: 15