Interaction Intention Recognition via Human Emotion for Human-Robot Natural Interaction

Cited by: 0
Authors
Yang, Shengtian [1 ]
Guan, Yisheng [1 ]
Li, Yihui [1 ]
Shi, Wenjing [1 ]
Affiliations
[1] Guangdong Univ Technol, Sch Electromech Engn, Guangzhou 510006, Peoples R China
Keywords
FACIAL EXPRESSIONS; FACE;
DOI
10.1109/AIM52237.2022.9863357
CLC Number
TP [Automation technology; Computer technology]
Subject Classification Code
0812
Abstract
In many social scenarios, human emotion governs external behavior, and behavior in turn reflects intention. If a social robot can recognize users' interaction intention from observable behavior, it can respond in a personalized way and thus exhibit "natural" behavior. In this paper, we propose a new method for social human-robot interaction in which the robot extracts facial expressions and body actions from RGB video alone to recognize whether a person intends to interact with it. Through feature extraction and fusion, a computational model of emotion is established to further identify the interlocutor's intention. We use a stacking model with three base classifiers to fuse the features, and we build a real-world, unconstrained dataset to evaluate our method. The results show that our method, which combines facial-expression and body-action features, outperforms single-cue methods, yielding higher accuracy, greater robustness, and reduced uncertainty.
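The abstract describes fusing facial-expression and body-action features with a stacking ensemble of three base classifiers to predict interaction intention. As an illustrative sketch only (the record does not specify the base learners, the meta-learner, or the feature dimensions), the snippet below shows one plausible way to realize such stacking-based fusion with scikit-learn; every name and parameter here is an assumption, not the authors' implementation.

```python
# Hypothetical sketch of stacking-based feature fusion for interaction-intention
# recognition. Assumptions (not from the paper): scikit-learn StackingClassifier,
# SVM / random-forest / k-NN base learners, a logistic-regression meta-learner,
# and synthetic facial-expression / body-action feature vectors.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Placeholder per-clip features: facial-expression descriptors (here 64-D) and
# body-action descriptors (here 128-D), concatenated into one fused vector.
n_clips = 500
face_feats = rng.normal(size=(n_clips, 64))
body_feats = rng.normal(size=(n_clips, 128))
X = np.hstack([face_feats, body_feats])
y = rng.integers(0, 2, size=n_clips)  # 1 = intends to interact, 0 = does not

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y
)

# Three base classifiers whose out-of-fold probability estimates are fused by a
# logistic-regression meta-learner (the stacking step).
base_learners = [
    ("svm", make_pipeline(StandardScaler(), SVC(probability=True))),
    ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
    ("knn", make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=7))),
]
stack = StackingClassifier(
    estimators=base_learners,
    final_estimator=LogisticRegression(max_iter=1000),
    stack_method="predict_proba",
    cv=5,
)

stack.fit(X_train, y_train)
print("intention-recognition accuracy:", stack.score(X_test, y_test))
```

In a real pipeline the face and body descriptors would come from the RGB-video feature extraction described in the paper rather than from random placeholders.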
Pages: 380 - 385
Number of pages: 6
Related Papers
50 records in total
  • [1] Emotion in human-robot interaction: Recognition and display
    Wendt, Cornelia
    Kuehnlenz, Kolja
    Popp, Michael
    Karg, Michaela
    [J]. INTERNATIONAL JOURNAL OF PSYCHOLOGY, 2008, 43 (3-4) : 578 - 578
  • [2] Natural Grasp Intention Recognition Based on Gaze in Human-Robot Interaction
    Yang, Bo
    Huang, Jian
    Chen, Xinxing
    Li, Xiaolong
    Hasegawa, Yasuhisa
    [J]. IEEE JOURNAL OF BIOMEDICAL AND HEALTH INFORMATICS, 2023, 27 (04) : 2059 - 2070
  • [3] Emotion Recognition in Human-Robot Interaction Using the NAO Robot
    Valagkouti, Iro Athina
    Troussas, Christos
    Krouska, Akrivi
    Feidakis, Michalis
    Sgouropoulou, Cleo
    [J]. COMPUTERS, 2022, 11 (05)
  • [4] Expressing reactive emotion based on multimodal emotion recognition for natural conversation in human-robot interaction
    Li, Yuanchao
    Ishi, Carlos Toshinori
    Inoue, Koji
    Nakamura, Shizuka
    Kawahara, Tatsuya
    [J]. ADVANCED ROBOTICS, 2019, 33 (20) : 1030 - 1041
  • [5] Analysis of Human Emotion in Human-Robot Interaction
    Blar, Noraidah
    Jafar, Fairul Azni
    Abdullah, Nurhidayu
    Muhammad, Mohd Nazrin
    Kassim, Anuar Muhamed
    [J]. INTERNATIONAL CONFERENCE ON MATHEMATICS, ENGINEERING AND INDUSTRIAL APPLICATIONS 2014 (ICOMEIA 2014), 2015, 1660
  • [6] Learning Multimodal Confidence for Intention Recognition in Human-Robot Interaction
    Zhao, Xiyuan
    Li, Huijun
    Miao, Tianyuan
    Zhu, Xianyi
    Wei, Zhikai
    Tan, Lifen
    Song, Aiguo
    [J]. IEEE ROBOTICS AND AUTOMATION LETTERS, 2024, 9 (09) : 7819 - 7826
  • [7] Multimodal Uncertainty Reduction for Intention Recognition in Human-Robot Interaction
    Trick, Susanne
    Koert, Dorothea
    Peters, Jan
    Rothkopf, Constantin A.
    [J]. 2019 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2019, : 7009 - 7016
  • [8] Analysing Action and Intention Recognition in Human-Robot Interaction with ANEMONE
    Alenljung, Beatrice
    Lindblom, Jessica
    [J]. HUMAN-COMPUTER INTERACTION: INTERACTION TECHNIQUES AND NOVEL APPLICATIONS, HCII 2021, PT II, 2021, 12763 : 181 - 200
  • [9] Natural Human-Robot Interaction
    Kanda, Takayuki
    [J]. SIMULATION, MODELING, AND PROGRAMMING FOR AUTONOMOUS ROBOTS, 2010, 6472 : 2 - 2
  • [10] Feature Reduction for Dimensional Emotion Recognition in Human-Robot Interaction
    Banda, Ntombikayise
    Engelbrecht, Andries
    Robinson, Peter
    [J]. 2015 IEEE SYMPOSIUM SERIES ON COMPUTATIONAL INTELLIGENCE (IEEE SSCI), 2015, : 803 - 810