Synthesizing facial expressions in dyadic human-robot interaction

Times cited: 0
Authors
Sham, Abdallah Hussein [1 ]
Tikka, Pia [1 ]
Lamas, David [2 ]
Anbarjafari, Gholamreza [3 ,4 ,5 ,6 ]
Affiliations
[1] Tallinn Univ, Balt Film Media & Arts Sch, Enact Virtual Lab, Narva Mnt 25, EE-10120 Tallinn, Estonia
[2] Tallinn Univ, Digital Technol Inst, Narva Mnt 25, EE-10120 Tallinn, Estonia
[3] iVCV OU, EE-51011 Tartu, Estonia
[4] Univ Tartu, iCV Lab, Tartu, Estonia
[5] Yildiz Tech Univ, Inst Higher Educ, Istanbul, Turkiye
[6] PwC Advisory, Helsinki, Finland
Keywords
Facial reaction emotion synthesis; Emotion recognition; Emotion reaction; Responsible AI
DOI
10.1007/s11760-024-03202-4
Chinese Library Classification (CLC)
TM [Electrical engineering]; TN [Electronic and communication technology];
Discipline codes
0808; 0809;
Abstract
Generative artificial intelligence (GenAI) can be used to create facial expressions of artificial human characters in real time based on a training dataset. However, the bottleneck that prevents natural dyadic interaction between an artificial character and a human lies in GenAI's limited capability to recognize dynamically changing contexts. To tackle this issue, we investigated how deep learning (DL) techniques could synthesize facial reaction emotions from a sequence of preceding emotions. We applied action units from the Facial Action Coding System to manipulate the facial points of an artificial character inside Unreal Engine 4, using the OpenFace API. First, the artificial character's facial behavior was programmed to mimic human facial expressions on screen. To produce adequate reaction emotions, we then trained an autoencoder combined with a long short-term memory model as our DL model. To validate the performance of the trained model, we compared the synthesized reaction expressions against our test dataset using the average root-mean-square error. Furthermore, sixteen test participants rated the apparent naturalness of the character's reactions to dynamic human expressions. Our findings are promising steps toward developing facial reaction emotion synthesis into a dynamic system that can adapt to the user's specific needs and context.
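The abstract's pipeline (action-unit sequences in, predicted reaction action units out, evaluated with average root-mean-square error) can be illustrated with a minimal sketch. The Python/Keras code below is an assumption-laden illustration, not the authors' implementation: the sequence length, the 17-dimensional AU intensity vector (the size OpenFace typically reports), the layer sizes, and all function names are hypothetical choices made only to show the general shape of an LSTM-based autoencoder trained on emotion sequences.

```python
# Hypothetical sketch: LSTM autoencoder mapping an observed sequence of
# facial action-unit (AU) intensities to a synthesized reaction sequence.
# All dimensions and hyperparameters are illustrative assumptions.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

SEQ_LEN = 30   # assumed: 30 frames of observed human expression
NUM_AUS = 17   # assumed: 17 AU intensities per frame (OpenFace-style output)

def build_model():
    inputs = keras.Input(shape=(SEQ_LEN, NUM_AUS))
    # Encoder: compress the observed AU sequence into a latent vector.
    encoded = layers.LSTM(64)(inputs)
    # Decoder: expand the latent vector back into a reaction AU sequence.
    repeated = layers.RepeatVector(SEQ_LEN)(encoded)
    decoded = layers.LSTM(64, return_sequences=True)(repeated)
    outputs = layers.TimeDistributed(layers.Dense(NUM_AUS))(decoded)
    model = keras.Model(inputs, outputs)
    model.compile(optimizer="adam", loss="mse")
    return model

def average_rmse(y_true, y_pred):
    # Average root-mean-square error over all sequences, frames, and AUs,
    # mirroring the quantitative comparison described in the abstract.
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

if __name__ == "__main__":
    model = build_model()
    # Dummy data standing in for paired (observed, reaction) AU sequences.
    x = np.random.rand(8, SEQ_LEN, NUM_AUS).astype("float32")
    y = np.random.rand(8, SEQ_LEN, NUM_AUS).astype("float32")
    model.fit(x, y, epochs=1, batch_size=4, verbose=0)
    print("average RMSE:", average_rmse(y, model.predict(x, verbose=0)))
```

In such a setup the predicted AU intensities would then drive the character's facial control points inside Unreal Engine 4; that rendering step is outside the scope of this sketch.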
Pages: 909-918 (10 pages)