Multimodal emotion recognition with evolutionary computation for human-robot interaction

Cited by: 52
Authors
Perez-Gaspar, Luis-Alberto [1 ]
Caballero-Morales, Santiago-Omar [1 ]
Trujillo-Romero, Felipe [1 ]
Affiliations
[1] Technol Univ Mixteca, Rd Acatlima Km 2-5, Huajuapan de Leon 69000, Oaxaca, Mexico
Keywords
Emotion recognition; Principal Component Analysis; Hidden Markov Models; Genetic Algorithms; Artificial Neural Networks; Finite state machines; SPEECH; CLASSIFIERS; FEATURES; FUSION;
DOI
10.1016/j.eswa.2016.08.047
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Service robotics is an important field of research for the development of assistive technologies. In particular, humanoid robots will play an increasingly important role in our society. More natural assistive interaction with humanoid robots can be achieved if the emotional aspect is considered. However, emotion recognition is one of the most challenging topics in pattern recognition, and improved intelligent techniques have to be developed to accomplish this goal. Recent research has addressed the emotion recognition problem with techniques such as Artificial Neural Networks (ANNs) and Hidden Markov Models (HMMs), and the reliability of the proposed approaches has been assessed (in most cases) with standard databases. In this work we (1) explored the implications of using standard databases for the assessment of emotion recognition techniques, (2) extended the evolutionary optimization of ANNs and HMMs for the development of a multimodal emotion recognition system, (3) set guidelines for the development of emotional databases of speech and facial expressions, (4) set rules for the phonetic transcription of Mexican speech, and (5) evaluated the suitability of the multimodal system within the context of spoken dialogue between a humanoid robot and human users. The development of intelligent systems for emotion recognition can be improved by the findings of the present work: (a) emotion recognition depends on the structure of the database sub-sets used for training and testing, and also on the type of recognition technique, since a specific emotion can be recognized best by a specific technique; (b) optimization of the HMMs led to a Bakis structure, which is more suitable for acoustic modeling of emotion-specific vowels, while optimization of the ANNs led to an ANN structure more suitable for the recognition of facial expressions; (c) some emotions can be recognized better from speech patterns than from visual patterns; and (d) a weighted integration of the multimodal emotion recognition system, optimized with these observations, can achieve a recognition rate of up to 97.00% in live dialogue tests with a humanoid robot. (C) 2016 Elsevier Ltd. All rights reserved.
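Point (d) of the abstract refers to a weighted, decision-level integration of the speech (HMM-based) and facial-expression (ANN-based) recognizers. The Python sketch below only illustrates that general kind of per-emotion weighted fusion under stated assumptions: the emotion labels, score values, per-emotion weights, and the function name fuse_modalities are hypothetical placeholders and are not taken from the paper.

```python
import numpy as np

# Illustrative emotion inventory; the paper's exact label set may differ.
EMOTIONS = ["anger", "happiness", "neutral", "sadness"]

def fuse_modalities(speech_scores, face_scores, w_speech, w_face):
    """Weighted decision-level fusion of per-emotion scores from two modalities.

    speech_scores : per-emotion confidences from the speech (HMM-based) recognizer
    face_scores   : per-emotion confidences from the facial-expression (ANN-based) recognizer
    w_speech, w_face : per-emotion weights; emotions recognized better from speech
        get a larger speech weight, and vice versa (such weights could, for example,
        be tuned on a validation set).
    """
    speech_scores = np.asarray(speech_scores, dtype=float)
    face_scores = np.asarray(face_scores, dtype=float)
    fused = np.asarray(w_speech) * speech_scores + np.asarray(w_face) * face_scores
    return EMOTIONS[int(np.argmax(fused))], fused

# Hypothetical example: the face model alone would pick "happiness",
# but speech, weighted more heavily for "sadness", carries the fused decision.
speech = [0.05, 0.15, 0.10, 0.70]
face   = [0.10, 0.50, 0.25, 0.15]
w_s    = [0.5, 0.3, 0.5, 0.7]   # placeholder per-emotion speech weights
w_f    = [0.5, 0.7, 0.5, 0.3]   # placeholder per-emotion face weights

label, fused = fuse_modalities(speech, face, w_s, w_f)
print(label, fused)   # -> 'sadness': the weighted speech evidence dominates
```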
Pages: 42-61
Page count: 20
Related papers
50 records in total
  • [11] Learning Multimodal Confidence for Intention Recognition in Human-Robot Interaction
    Zhao, Xiyuan
    Li, Huijun
    Miao, Tianyuan
    Zhu, Xianyi
    Wei, Zhikai
    Tan, Lifen
    Song, Aiguo
    [J]. IEEE ROBOTICS AND AUTOMATION LETTERS, 2024, 9 (09) : 7819 - 7826
  • [12] Emotion Recognition From Speech to Improve Human-robot Interaction
    Zhu, Changrui
    Ahmad, Wasim
    [J]. IEEE 17TH INT CONF ON DEPENDABLE, AUTONOM AND SECURE COMP / IEEE 17TH INT CONF ON PERVAS INTELLIGENCE AND COMP / IEEE 5TH INT CONF ON CLOUD AND BIG DATA COMP / IEEE 4TH CYBER SCIENCE AND TECHNOLOGY CONGRESS (DASC/PICOM/CBDCOM/CYBERSCITECH), 2019, : 370 - 375
  • [13] Multimodal Interaction for Human-Robot Teams
    Burke, Dustin
    Schurr, Nathan
    Ayers, Jeanine
    Rousseau, Jeff
    Fertitta, John
    Carlin, Alan
    Dumond, Danielle
    [J]. UNMANNED SYSTEMS TECHNOLOGY XV, 2013, 8741
  • [14] Emotion Analysis in Human-Robot Interaction
    Szaboova, Martina
    Sarnovsky, Martin
    Maslej Kresnakova, Viera
    Machova, Kristina
    [J]. ELECTRONICS, 2020, 9 (11) : 1 - 31
  • [15] A Facial Expression Emotion Recognition Based Human-robot Interaction System
    Zhentao Liu
    Min Wu
    Weihua Cao
    Luefeng Chen
    Jianping Xu
    Ri Zhang
    Mengtian Zhou
    Junwei Mao
    [J]. IEEE/CAA Journal of Automatica Sinica, 2017, 4 (04) : 668 - 676
  • [16] Emotion recognition in non-structured utterances for human-robot interaction
    Martínez, CK
    Cruz, AB
    [J]. 2005 IEEE International Workshop on Robot and Human Interactive Communication (RO-MAN), 2005, : 19 - 23
  • [17] Emotion Recognition for Human-Robot Interaction: Recent Advances and Future Perspectives
    Spezialetti, Matteo
    Placidi, Giuseppe
    Rossi, Silvia
    [J]. FRONTIERS IN ROBOTICS AND AI, 2020, 7
  • [18] A Facial Expression Emotion Recognition Based Human-robot Interaction System
    Liu, Zhentao
    Wu, Min
    Cao, Weihua
    Chen, Luefeng
    Xu, Jianping
    Zhang, Ri
    Zhou, Mengtian
    Mao, Junwei
    [J]. IEEE-CAA JOURNAL OF AUTOMATICA SINICA, 2017, 4 (04) : 668 - 676
  • [19] Analysis of Human Emotion in Human-Robot Interaction
    Blar, Noraidah
    Jafar, Fairul Azni
    Abdullah, Nurhidayu
    Muhammad, Mohd Nazrin
    Kassim, Anuar Muhamed
    [J]. INTERNATIONAL CONFERENCE ON MATHEMATICS, ENGINEERING AND INDUSTRIAL APPLICATIONS 2014 (ICOMEIA 2014), 2015, 1660
  • [20] Hierarchical Attention Approach in Multimodal Emotion Recognition for Human Robot Interaction
    Abdullah, Muhammad
    Ahmad, Mobeen
    Han, Dongil
    [J]. 2021 36TH INTERNATIONAL TECHNICAL CONFERENCE ON CIRCUITS/SYSTEMS, COMPUTERS AND COMMUNICATIONS (ITC-CSCC), 2021,