Learning Bodily Expression of Emotion for Social Robots Through Human Interaction

Cited by: 9
Authors
Tuyen, Nguyen Tan Viet [1 ]
Elibol, Armagan [1 ]
Chong, Nak Young [1 ]
Affiliations
[1] Japan Adv Inst Sci & Technol, Sch Informat Sci, Nomi 9231211, Japan
Keywords
Robots; Psychology; Trajectory; Unsupervised learning; Cultural differences; Human-robot interaction; Global communication; Affective behaviors; cross-cultural evaluation; human-robot interaction (HRI); imitation learning
DOI
10.1109/TCDS.2020.3005907
CLC classification
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Human facial and bodily expressions play a crucial role in human-human interaction, conveying the communicator's feelings. Echoing the influence of human social behavior, recent studies in human-robot interaction (HRI) have investigated how to generate emotional behaviors for social robots. Emotional behaviors can enhance user engagement, allowing the user to interact with robots in a transparent manner. However, such behaviors are ambiguous and affected by many factors, such as personality traits, culture, and environment. This article focuses on developing the robot's emotional bodily expressions by adopting the user's affective gestures. We propose a behavior selection and transformation model that enables the robot to incrementally learn from the user's gestures, to select the user's habitual behaviors, and to transform the selected behaviors into robot motions. Experimental results under several scenarios showed that the proposed incremental learning model endows a social robot with the capability of entering into positive, long-lasting HRI. We also confirmed that the robot can express emotions through the imitated motions of the user. The robot's emotional gestures, which reflected the interacting partner's traits, were widely accepted within the same cultural group and were perceptible across different cultural groups, albeit in different ways.
Pages: 16 - 30 (15 pages)
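
The abstract describes a two-stage pipeline: the robot incrementally collects the user's affective gestures, selects the user's habitual behaviors, and transforms the selected behaviors into robot motions. The Python listing below is a minimal illustrative sketch of that idea, not the authors' implementation; it assumes gestures are recorded as joint-angle trajectories, and the names GestureMemory and retarget_to_robot, as well as the joint ranges, are hypothetical.

# Minimal sketch (not the paper's method) of incremental behavior selection
# and transformation: store user gesture trajectories per emotion, pick the
# most habitual one (closest to the running mean), and linearly retarget it
# into assumed robot joint limits.
import numpy as np
from collections import defaultdict


class GestureMemory:
    """Incrementally stores user gesture trajectories grouped by emotion label."""

    def __init__(self):
        self.trajectories = defaultdict(list)  # emotion -> list of (T, J) arrays

    def add(self, emotion, trajectory):
        self.trajectories[emotion].append(np.asarray(trajectory, dtype=float))

    def select_habitual(self, emotion):
        """Return the stored trajectory closest to the mean of all observations
        (a simple stand-in for the paper's behavior-selection step)."""
        trajs = self.trajectories[emotion]
        mean = np.mean(np.stack(trajs), axis=0)
        dists = [np.linalg.norm(t - mean) for t in trajs]
        return trajs[int(np.argmin(dists))]


def retarget_to_robot(trajectory, human_range=(-1.0, 1.0), robot_range=(-0.5, 0.5)):
    """Linearly map human joint values into the robot's joint limits
    (a placeholder for the paper's behavior-transformation step)."""
    h_lo, h_hi = human_range
    r_lo, r_hi = robot_range
    scaled = (trajectory - h_lo) / (h_hi - h_lo)  # normalize to [0, 1]
    return r_lo + scaled * (r_hi - r_lo)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    memory = GestureMemory()
    # Simulate observing three "happy" gestures: 20 frames, 4 joint angles each.
    for _ in range(3):
        memory.add("happy", rng.uniform(-1.0, 1.0, size=(20, 4)))

    habitual = memory.select_habitual("happy")
    robot_motion = retarget_to_robot(habitual)
    print("Selected gesture shape:", habitual.shape)
    print("Robot motion range:", robot_motion.min(), robot_motion.max())

In the paper, both stages are learned from interaction data; the mean-distance selection and linear joint scaling here merely stand in for those learned components.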