Interactive Robot Learning for Multimodal Emotion Recognition

Cited by: 24
Authors
Yu, Chuang [1 ]
Tapus, Adriana [1 ]
Affiliations
[1] ENSTA Paris Inst Polytech Paris, Autonomous Syst & Robot Lab, U2IS, 828 Blvd Marechaux, F-91120 Palaiseau, France
Source
SOCIAL ROBOTICS, ICSR 2019 | 2019, Vol. 11876
Keywords
Interactive robot learning; Multimodal emotion recognition; Human-robot interaction;
DOI
10.1007/978-3-030-35888-4_59
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Interaction plays a critical role in learning the skills needed for natural communication. In human-robot interaction (HRI), robots can use feedback gathered during the interaction to improve their social abilities. In this context, we propose an interactive robot learning framework that uses multimodal data from thermal facial images and human gait data for online emotion recognition. We also propose a new decision-level fusion method for the multimodal classification using a Random Forest (RF) model. Our hybrid online emotion recognition model focuses on the detection of four human emotions (i.e., neutral, happiness, anger, and sadness). After offline training and testing of the hybrid model, the accuracy of the online emotion recognition system was more than 10% lower than that of the offline one. To improve the system, human verbal feedback is injected into the robot's interactive learning. With the new online emotion recognition system, a 12.5% accuracy increase is obtained compared with the online system without interactive robot learning.
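The decision-level fusion described in the abstract can be pictured as follows: each modality-specific classifier (one for thermal facial images, one for gait) emits a probability distribution over the four emotions, and the fused decision combines those distributions. This is a minimal illustrative sketch, not the paper's implementation; the weighted-average rule, class order, and weights are assumptions for demonstration.

```python
# Hypothetical sketch of decision-level fusion for two modality-specific
# classifiers. Each classifier (e.g., an RF trained on thermal-face features,
# another on gait features) outputs per-class probabilities; the fused label
# is the argmax of their weighted average. Weights and class order are
# illustrative assumptions, not values from the paper.

EMOTIONS = ["neutral", "happiness", "anger", "sadness"]

def fuse_decisions(p_thermal, p_gait, w_thermal=0.5, w_gait=0.5):
    """Weighted-average (decision-level) fusion of two probability vectors.

    Returns the fused emotion label and the renormalised fused distribution.
    """
    fused = [w_thermal * a + w_gait * b for a, b in zip(p_thermal, p_gait)]
    total = sum(fused)                       # renormalise for safety
    fused = [x / total for x in fused]
    best = max(range(len(fused)), key=fused.__getitem__)
    return EMOTIONS[best], fused

# Example: the thermal model leans toward "happiness", the gait model
# toward "neutral"; the average still favours "happiness".
label, probs = fuse_decisions([0.1, 0.6, 0.2, 0.1], [0.5, 0.3, 0.1, 0.1])
print(label)  # -> happiness (fused distribution [0.3, 0.45, 0.15, 0.1])
```

In an interactive-learning setting of the kind the paper describes, the human's verbal feedback could then be used as a corrective label to retrain or reweight the per-modality classifiers online.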
Pages: 633-642
Number of pages: 10
Related Papers
50 records
  • [1] Multimodal Emotion Recognition for Human Robot Interaction
    Adiga, Sharvari
    Vaishnavi, D. V.
    Saxena, Suchitra
    Tripathi, Shikha
    [J]. 2020 7TH INTERNATIONAL CONFERENCE ON SOFT COMPUTING & MACHINE INTELLIGENCE (ISCMI 2020), 2020, : 197 - 203
  • [2] Emotion interactive robot focus on speaker independently emotion recognition
    Kim, Eun Ho
    Hyun, Kyung Hak
    Kim, Soo Hyun
    Kwak, Yoon Keun
    [J]. 2007 IEEE/ASME INTERNATIONAL CONFERENCE ON ADVANCED INTELLIGENT MECHATRONICS, VOLS 1-3, 2007, : 280 - 285
  • [3] Interactive Multimodal Attention Network for Emotion Recognition in Conversation
    Ren, Minjie
    Huang, Xiangdong
    Shi, Xiaoqi
    Nie, Weizhi
    [J]. IEEE SIGNAL PROCESSING LETTERS, 2021, 28 : 1046 - 1050
  • [4] Emotion Recognition from Speech for an Interactive Robot Agent
    Anjum, Madiha
    [J]. 2019 IEEE/SICE INTERNATIONAL SYMPOSIUM ON SYSTEM INTEGRATION (SII), 2019, : 363 - 368
  • [5] Multimodal Knowledge-enhanced Interactive Network with Mixed Contrastive Learning for Emotion Recognition in Conversation
    Shen, Xudong
    Huang, Xianying
    Zou, Shihao
    Gan, Xinyi
    [J]. NEUROCOMPUTING, 2024, 582
  • [6] Emotion Recognition Using Multimodal Deep Learning
    Liu, Wei
    Zheng, Wei-Long
    Lu, Bao-Liang
    [J]. NEURAL INFORMATION PROCESSING, ICONIP 2016, PT II, 2016, 9948 : 521 - 529
  • [7] Emotion Recognition on Multimodal with Deep Learning and Ensemble
    Dharma, David Adi
    Zahra, Amalia
    [J]. INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2022, 13 (12) : 656 - 663
  • [8] Disentangled Representation Learning for Multimodal Emotion Recognition
    Yang, Dingkang
    Huang, Shuai
    Kuang, Haopeng
    Du, Yangtao
    Zhang, Lihua
    [J]. PROCEEDINGS OF THE 30TH ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, MM 2022, 2022, : 1642 - 1651
  • [9] Comparing Recognition Performance and Robustness of Multimodal Deep Learning Models for Multimodal Emotion Recognition
    Liu, Wei
    Qiu, Jie-Lin
    Zheng, Wei-Long
    Lu, Bao-Liang
    [J]. IEEE TRANSACTIONS ON COGNITIVE AND DEVELOPMENTAL SYSTEMS, 2022, 14 (02) : 715 - 729