Interaction Intention Recognition via Human Emotion for Human-Robot Natural Interaction

Cited by: 0
Authors
Yang, Shengtian [1 ]
Guan, Yisheng [1 ]
Li, Yihui [1 ]
Shi, Wenjing [1 ]
Affiliations
[1] Guangdong Univ Technol, Sch Electromech Engn, Guangzhou 510006, Peoples R China
Keywords
FACIAL EXPRESSIONS; FACE;
DOI
10.1109/AIM52237.2022.9863357
Chinese Library Classification
TP [Automation technology, computer technology];
Discipline Classification Code
0812
Abstract
In many social scenarios, human emotion governs external behavior, and behavior in turn reflects intention. If a social robot can recognize a user's interaction intention from observable behaviors, it can respond in a personalized way and thus exhibit "natural" behavior. In this paper, we propose a new method for social human-robot interaction in which the robot obtains people's facial expressions and body actions from RGB videos alone to recognize whether they intend to interact with it. Through feature extraction and fusion, a computational model of emotion is established to further identify the interlocutors' intention. We use a stacking model with three base classifiers to fuse the features, and we create a real, unconstrained dataset to evaluate our method. The results show that our method, which combines facial expression and body action features, outperforms single-cue methods in accuracy and robustness while reducing uncertainty.
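
The following is a minimal, hypothetical sketch of the kind of stacking-based fusion the abstract describes, assuming scikit-learn and synthetic placeholder features; the specific base classifiers, feature dimensions, fusion scheme, and meta-learner here are illustrative assumptions, not the authors' reported configuration.

# Sketch only: facial-expression and body-action feature vectors are
# concatenated and passed to a stacking ensemble with three base classifiers.
# All feature values and labels below are synthetic placeholders.
import numpy as np
from sklearn.ensemble import StackingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Placeholder descriptors standing in for features extracted from RGB video.
n_samples = 400
face_feats = rng.normal(size=(n_samples, 16))   # hypothetical facial-expression features
body_feats = rng.normal(size=(n_samples, 24))   # hypothetical body-action features
X = np.hstack([face_feats, body_feats])         # fuse cues by concatenation
y = rng.integers(0, 2, size=n_samples)          # 1 = intends to interact, 0 = does not

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y
)

# Three base classifiers combined by a logistic-regression meta-learner
# (an assumed choice; the paper does not specify these in the abstract).
stack = StackingClassifier(
    estimators=[
        ("svm", SVC(probability=True, random_state=0)),
        ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
        ("knn", KNeighborsClassifier(n_neighbors=5)),
    ],
    final_estimator=LogisticRegression(max_iter=1000),
    cv=5,
)
stack.fit(X_train, y_train)
print("intention-recognition accuracy:", accuracy_score(y_test, stack.predict(X_test)))

In practice, the placeholder feature arrays would be replaced by real per-person descriptors (e.g., expression and pose features extracted from the video frames) and the binary label would indicate whether the person intends to interact with the robot.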
Pages
380-385 (6 pages)
Related papers
50 records in total
  • [31] Intention Based Comparative Analysis of Human-Robot Interaction
    Awais, Muhammad
    Saeed, Muhammad Yahya
    Malik, Muhammad Sheraz Arshad
    Younas, Muhammad
    Rao Iqbal Asif, Sohail
    [J]. IEEE ACCESS, 2020, 8 : 205821 - 205835
  • [32] Speech Emotion Recognition Using an Enhanced Kernel Isomap for Human-Robot Interaction
    Zhang, Shiqing
    Zhao, Xiaoming
    Lei, Bicheng
    [J]. INTERNATIONAL JOURNAL OF ADVANCED ROBOTIC SYSTEMS, 2013, 10
  • [33] Speech emotion recognition in real static and dynamic human-robot interaction scenarios
    Grageda, Nicolas
    Busso, Carlos
    Alvarado, Eduardo
    Garcia, Ricardo
    Mahu, Rodrigo
    Huenupan, Fernando
    Yoma, Nestor Becerra
    [J]. COMPUTER SPEECH AND LANGUAGE, 2025, 89
  • [34] Intention based Control for Physical Human-robot Interaction
    Lyu, Shangke
    Cheah, Chien Chern
    [J]. PROCEEDINGS OF 2018 IEEE INTERNATIONAL CONFERENCE ON REAL-TIME COMPUTING AND ROBOTICS (IEEE RCAR), 2018, : 1 - 6
  • [35] Mandarin Emotion Recognition Based on Multifractal Theory Towards Human-Robot Interaction
    Liu, Hong
    Zhang, Wenjuan
    [J]. 2013 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND BIOMIMETICS (ROBIO), 2013, : 593 - 598
  • [36] Confidence Fusion Based Emotion Recognition of Multiple Persons for Human-Robot Interaction
    Luo, Ren C.
    Lin, Pei Hsien
    Chang, Li Wen
    [J]. 2012 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2012, : 4590 - 4595
  • [37] Interactive Emotion Recognition Using Support Vector Machine for Human-Robot Interaction
    Tsai, Ching-Chih
    Chen, You-Zhu
    Liao, Ching-Wen
    [J]. 2009 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN AND CYBERNETICS (SMC 2009), VOLS 1-9, 2009, : 407 - 412
  • [38] Face recognition and tracking for human-robot interaction
    Song, KT
    Chen, WJ
    [J]. 2004 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN & CYBERNETICS, VOLS 1-7, 2004, : 2877 - 2882
  • [39] Recognition in Human-Robot Interaction: The Gateway to Engagement
    Brinck, Ingar
    Balkenius, Christian
    [J]. 2019 JOINT IEEE 9TH INTERNATIONAL CONFERENCE ON DEVELOPMENT AND LEARNING AND EPIGENETIC ROBOTICS (ICDL-EPIROB), 2019, : 31 - 36
  • [40] Human-robot interaction - Facial gesture recognition
    Rudall, BH
    [J]. ROBOTICA, 1996, 14 : 596 - 597