Facial expressions contribute more than body movements to conversational outcomes in avatar-mediated virtual environments

Authors
Catherine Oh Kruzic
David Kruzic
Fernanda Herrera
Jeremy Bailenson
Affiliation
[1] Virtual Human Interaction Lab, Department of Communication, Stanford University
Abstract
This study examines the individual and joint contributions of two nonverbal channels (the face and the upper body) to conversational outcomes in avatar-mediated virtual environments. A total of 140 dyads were randomly assigned to communicate with each other via platforms that differentially activated or deactivated facial and bodily nonverbal cues. The availability of facial expressions had a positive effect on interpersonal outcomes: dyads who could see their partner's facial movements mapped onto their avatars liked each other more, formed more accurate impressions of their partners, and described their interaction experiences more positively than dyads who could not. The last of these effects, however, held only when the partner's bodily gestures were also available, not when facial movements alone were available. Dyads also showed greater nonverbal synchrony when they could see both their partner's bodily and facial movements. In addition, the study employed machine learning to explore whether automatically tracked nonverbal cues could predict interpersonal attraction; these classifiers distinguished high from low interpersonal attraction with 65% accuracy. The findings highlight the relative importance of facial cues over bodily cues for interpersonal outcomes in virtual environments and point to the potential of automatically tracked nonverbal cues for predicting interpersonal attitudes.
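To make the classification step concrete, the sketch below shows one way such a prediction could be set up: a cross-validated classifier that maps per-dyad nonverbal features to a high/low interpersonal-attraction label. This is not the authors' pipeline; the feature names, the synthetic data, and the choice of logistic regression are all illustrative assumptions, included only to show the general shape of the analysis the abstract describes.

```python
# Illustrative sketch only (not the paper's actual pipeline or features).
# Hypothetical per-dyad nonverbal features are classified into high vs. low
# interpersonal attraction, with 10-fold cross-validation estimating accuracy.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical features per dyad, e.g., mean facial-expression activation,
# head-rotation variability, upper-body movement energy, movement synchrony.
n_dyads = 140
X = rng.normal(size=(n_dyads, 4))

# Hypothetical binary label: high (1) vs. low (0) interpersonal attraction,
# here derived from a median split of a noisy latent score.
latent = X @ np.array([0.8, 0.2, 0.3, 0.6]) + rng.normal(scale=1.5, size=n_dyads)
y = (latent > np.median(latent)).astype(int)

# Standardize features, fit a regularized logistic-regression classifier,
# and report mean out-of-sample accuracy across 10 folds.
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(clf, X, y, cv=10, scoring="accuracy")
print(f"Mean cross-validated accuracy: {scores.mean():.2f}")
```

With real tracked features, an accuracy estimate from this kind of cross-validated setup is what a figure like the reported 65% would correspond to.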