The Role of Coherent Robot Behavior and Embodiment in Emotion Perception and Recognition During Human-Robot Interaction: Experimental Study

Citations: 0
Authors
Fiorini, Laura [1,2,6]
D'Onofrio, Grazia [3 ]
Sorrentino, Alessandra [1 ]
Loizzo, Federica Gabriella Cornacchia [2 ]
Russo, Sergio [4 ]
Ciccone, Filomena [3 ]
Giuliani, Francesco [4 ]
Sancarlo, Daniele [5 ]
Cavallo, Filippo [1,2]
Affiliations
[1] Univ Florence, Dept Ind Engn, Florence, Italy
[2] Scuola Super Sant Anna, BioRobot Inst, Pisa, Italy
[3] Fdn IRCCS Casa Sollievo Sofferenza, Hlth Dept, Clin Psychol Serv, San Giovanni Rotondo, Foggia, Italy
[4] Fdn IRCCS Casa Sollievo Sofferenza, Innovat & Res Unit, San Giovanni Rotondo, Foggia, Italy
[5] Fdn IRCCS Casa Sollievo Sofferenza, Dept Med Sci, Complex Unit Geriatr, San Giovanni Rotondo, Foggia, Italy
[6] Univ Florence, Dept Ind Engn, Via Santa Marta 3, I-50139 Florence, Italy
Source
JMIR HUMAN FACTORS | 2024, Vol. 11
Keywords
social robot; emotion recognition; human emotion perception; human-robot interaction; robot cospeech gestures evaluation
DOI
10.2196/45494
Chinese Library Classification
R19 [health care organization and services (health services administration)]
Abstract
Background: Social robots are becoming increasingly important as companions in our daily lives. Consequently, humans expect to interact with them using the same mental models applied to human-human interactions, including the use of cospeech gestures. Research efforts have been devoted to understanding users' needs and to developing robot behavioral models that can perceive the user's state and properly plan a reaction. Despite these efforts, some challenges regarding the effect of robot embodiment and behavior on the perception of emotions remain open.
Objective: The aim of this study is twofold. First, it assesses the role of the robot's cospeech gestures and embodiment in the user's perceived emotions in terms of valence (stimulus pleasantness), arousal (intensity of the evoked emotion), and dominance (degree of control exerted by the stimulus). Second, it evaluates the robot's accuracy in identifying positive, negative, and neutral emotions displayed by interacting humans using 3 supervised machine learning algorithms, including support vector machine and K-nearest neighbor classifiers.
Methods: The Pepper robot was used to elicit the 3 emotions in humans using a set of 60 images retrieved from a standardized database. In particular, 2 experimental conditions for emotion elicitation were performed with the Pepper robot: with static behavior or with coherent (COH) cospeech behavior. Furthermore, to evaluate the role of robot embodiment, a third elicitation was performed by asking the participant to interact with a PC, where a graphical interface showed the same images. Each participant was asked to undergo only 1 of the 3 experimental conditions.
Results: A total of 60 participants were recruited for this study, 20 for each experimental condition, for a total of 3600 interactions. The results showed significant differences (P<.05) in valence, arousal, and dominance when participants were stimulated by the COH Pepper robot with respect to the PC condition, underlining the importance of the robot's nonverbal communication and embodiment. A higher valence score was obtained for the robot elicitations (both COH and static behavior) than for the PC. For emotion recognition, the K-nearest neighbor classifiers achieved the best accuracy; in particular, the COH modality achieved the highest accuracy (0.97) compared with the static behavior and PC elicitations (0.88 and 0.94, respectively).
Conclusions: The results suggest that the use of multimodal communication channels, such as cospeech and visual channels, as in the COH modality, may improve the recognition accuracy of the user's emotional state and can reinforce the perceived emotion. Future studies should investigate the effects of age, culture, and cognitive profile on emotion perception and recognition, going beyond the limitations of this work.
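To make the recognition step concrete, the following is a minimal sketch of the kind of classifier comparison the abstract describes, written in Python with scikit-learn. It is an illustration under stated assumptions, not the authors' pipeline: the synthetic feature matrix stands in for whatever interaction features (eg, facial or audio cues) were actually extracted per interaction, and the hyperparameters (n_neighbors=5, RBF kernel) are placeholders.

import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Placeholder data: 3600 interactions x 16 features, 3 emotion classes.
# In the study, each interaction would instead yield real features recorded
# while a participant viewed one of the 60 elicitation images.
X = rng.normal(size=(3600, 16))
y = rng.integers(0, 3, size=3600)  # 0=negative, 1=neutral, 2=positive

classifiers = {
    "KNN": make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5)),
    "SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf")),
}

for name, clf in classifiers.items():
    # 10-fold cross-validated accuracy, the metric reported in the abstract.
    scores = cross_val_score(clf, X, y, cv=10, scoring="accuracy")
    print(f"{name}: mean accuracy = {scores.mean():.2f}")

Standardizing the features matters here because both classifiers are distance or margin based. With random features, accuracy hovers near chance for 3 balanced classes (about 0.33); the study's real interaction features separated the classes far better (0.88-0.97).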
Pages: 14