Inference of Human Beings’ Emotional States from Speech in Human–Robot Interactions

Cited: 0
Authors
Laurence Devillers
Marie Tahon
Mohamed A. Sehili
Agnes Delaborde
Affiliations
[1] LIMSI-CNRS
[2] Université Paris-Sorbonne IV
Keywords
Human–robot interaction; Emotion recognition; Prediction reliability; Real-life data
DOI
Not available
Abstract
The challenge of this study is twofold: recognizing emotions from audio signals in a naturalistic Human–Robot Interaction (HRI) environment, and using cross-dataset recognition to evaluate robustness. The originality of this work lies in the use of six emotional models in parallel, generated from two training corpora and three acoustic feature sets. The models are trained on two databases collected in different tasks, and a third, independent real-life HRI corpus (collected within the ROMEO project, http://www.projetromeo.com/) is used for testing. As a primary result, on a four-emotion recognition task, combining the probabilistic outputs of the six systems in a very simplistic way yields better results than the best baseline system. Moreover, to investigate the potential of fusing the systems' outputs with a "perfect" fusion method, we compute the oracle performance (the oracle counts a prediction as correct if at least one of the systems outputs the correct label). The oracle score is 73 %, while the auto-coherence score on the same corpus (i.e. the performance obtained by using the same data for training and testing) is about 57 %. We also experiment with a reliability estimation protocol that exploits the outputs of the different systems. Such a reliability measure of an emotion recognition system's decision could help construct a relevant emotional and interactional user profile, which could in turn be used to drive the expressive behavior of the robot.
Journal: International Journal of Social Robotics, 2015, 7(4)
Pages: 451–463 (12 pages)