Inference of Human Beings’ Emotional States from Speech in Human–Robot Interactions

Cited by: 0
Authors
Laurence Devillers
Marie Tahon
Mohamed A. Sehili
Agnes Delaborde
Affiliations
[1] LIMSI-CNRS
[2] Université Paris-Sorbonne IV
Source
International Journal of Social Robotics | 2015 / Volume 7
Keywords
Human–robot interaction; Emotion recognition; Prediction reliability; Real-life data;
DOI: not available
Abstract
The challenge of this study is twofold: recognizing emotions from audio signals in a naturalistic Human–Robot Interaction (HRI) environment, and using cross-dataset recognition to evaluate robustness. The originality of this work lies in the use of six emotional models in parallel, generated from two training corpora and three acoustic feature sets. The models are obtained from two databases collected in different tasks, and a third, independent real-life HRI corpus (collected within the ROMEO project, http://www.projetromeo.com/) is used for testing. As a primary result, for the task of four-emotion recognition, combining the probabilistic outputs of the six systems in a very simple way yields better results than the best baseline system. Moreover, to investigate the potential of fusing the systems' outputs with a "perfect" fusion method, we compute the oracle performance: the oracle counts a prediction as correct if at least one of the systems outputs the correct label. The oracle score is 73%, while the auto-coherence score on the same corpus (i.e., the performance obtained by using the same data for training and testing) is about 57%. We also experiment with a reliability estimation protocol that makes use of the outputs of the different systems. Such a reliability measure of an emotion recognition system's decisions could help build a relevant emotional and interactional user profile, which could in turn drive the expressive behavior of the robot.
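A minimal sketch of the two scores the abstract refers to, assuming each of the six systems emits per-class posterior probabilities over the four emotions; the array names, the random stand-in data, and the averaging rule are illustrative assumptions, not the paper's published method:

    import numpy as np

    rng = np.random.default_rng(0)
    n_systems, n_samples, n_classes = 6, 100, 4
    # Stand-in posteriors and labels (the real systems are trained on the
    # two corpora described above; these values are random placeholders).
    posteriors = rng.dirichlet(np.ones(n_classes), size=(n_systems, n_samples))
    labels = rng.integers(0, n_classes, size=n_samples)

    # "Very simple" late fusion: average the six posterior distributions,
    # then pick the most probable of the four emotion classes.
    fusion_pred = posteriors.mean(axis=0).argmax(axis=-1)
    fusion_score = (fusion_pred == labels).mean()

    # Oracle score: a sample counts as correct if at least one of the six
    # systems, taken individually, predicts its true label.
    individual_pred = posteriors.argmax(axis=-1)  # shape (6, n_samples)
    oracle_score = (individual_pred == labels).any(axis=0).mean()

    print(f"fused accuracy: {fusion_score:.2f}, oracle: {oracle_score:.2f}")

Averaging is only one plausible reading of combining outputs "in a very simple way"; the oracle, by construction, upper-bounds what any fusion of the six individual decisions could achieve.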
Pages: 451–463 (12 pages)