Multimodal emotion recognition with evolutionary computation for human-robot interaction

Cited: 52
Authors
Perez-Gaspar, Luis-Alberto [1 ]
Caballero-Morales, Santiago-Omar [1 ]
Trujillo-Romero, Felipe [1 ]
Institutions
[1] Technol Univ Mixteca, Rd Acatlima Km 2-5, Huajuapan de Leon 69000, Oaxaca, Mexico
Keywords
Emotion recognition; Principal Component Analysis; Hidden Markov Models; Genetic Algorithms; Artificial Neural Networks; Finite state machines; SPEECH; CLASSIFIERS; FEATURES; FUSION
DOI
10.1016/j.eswa.2016.08.047
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Service robotics is an important field of research for the development of assistive technologies. In particular, humanoid robots will play an increasingly important role in our society. More natural assistive interaction with humanoid robots can be achieved if the emotional aspect is considered. However, emotion recognition is one of the most challenging topics in pattern recognition, and improved intelligent techniques must be developed to accomplish this goal. Recent research has addressed the emotion recognition problem with techniques such as Artificial Neural Networks (ANNs) and Hidden Markov Models (HMMs), and the reliability of the proposed approaches has been assessed (in most cases) with standard databases. In this work we (1) explored the implications of using standard databases for the assessment of emotion recognition techniques, (2) extended the evolutionary optimization of ANNs and HMMs for the development of a multimodal emotion recognition system, (3) set guidelines for the development of emotional databases of speech and facial expressions, (4) set rules for the phonetic transcription of Mexican speech, and (5) evaluated the suitability of the multimodal system within the context of spoken dialogue between a humanoid robot and human users.
The development of intelligent systems for emotion recognition can be improved by the findings of the present work: (a) emotion recognition depends on the structure of the database sub-sets used for training and testing, and also on the type of technique used for recognition, since a specific emotion can be highly recognized by a specific technique; (b) optimization of the HMMs led to a Bakis structure, which is more suitable for acoustic modeling of emotion-specific vowels, while optimization of the ANNs led to a structure more suitable for the recognition of facial expressions; (c) some emotions can be better recognized from speech patterns than from visual patterns; and (d) the weighted integration of the multimodal emotion recognition system, optimized with these observations, can achieve a recognition rate of up to 97.00% in live dialogue tests with a humanoid robot. (C) 2016 Elsevier Ltd. All rights reserved.
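The weighted integration described in point (d) is a late-fusion scheme: each modality classifier (HMM for speech, ANN for facial expressions) produces per-emotion scores, and the final decision maximizes their weighted sum. The following sketch is illustrative only — the emotion labels, weight values, and function names are assumptions, not the configuration reported in the paper.

```python
# Hypothetical sketch of weighted late fusion of two modality classifiers.
# Labels and weights are illustrative, not the paper's reported values.

EMOTIONS = ["anger", "happiness", "neutral", "sadness"]

def fuse(speech_scores, face_scores, w_speech=0.5, w_face=0.5):
    """Combine per-emotion scores from the speech (HMM) and facial (ANN)
    classifiers by weighted sum, and return the winning emotion."""
    combined = {
        e: w_speech * speech_scores[e] + w_face * face_scores[e]
        for e in EMOTIONS
    }
    return max(combined, key=combined.get)

# Example: speech strongly indicates anger, face weakly indicates happiness.
speech = {"anger": 0.7, "happiness": 0.1, "neutral": 0.1, "sadness": 0.1}
face = {"anger": 0.3, "happiness": 0.4, "neutral": 0.2, "sadness": 0.1}
print(fuse(speech, face))  # anger wins under equal weights (0.5 vs. 0.25)
```

Per finding (c), modality weights could be tuned per emotion, so that emotions better conveyed by speech lean on the HMM scores and vice versa.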
Pages: 42-61
Page count: 20
Related papers
50 records total
  • [1] Expressing reactive emotion based on multimodal emotion recognition for natural conversation in human-robot interaction
    Li, Yuanchao
    Ishi, Carlos Toshinori
    Inoue, Koji
    Nakamura, Shizuka
    Kawahara, Tatsuya
    [J]. ADVANCED ROBOTICS, 2019, 33 (20) : 1030 - 1041
  • [2] Emotion in human-robot interaction: Recognition and display
    Wendt, Cornelia
    Kuehnlenz, Kolja
    Popp, Michael
    Karg, Michelle
    [J]. INTERNATIONAL JOURNAL OF PSYCHOLOGY, 2008, 43 (3-4) : 578 - 578
  • [3] Multimodal Emotion Recognition for Human Robot Interaction
    Adiga, Sharvari
    Vaishnavi, D. V.
    Saxena, Suchitra
    Tripathi, Shikha
    [J]. 2020 7TH INTERNATIONAL CONFERENCE ON SOFT COMPUTING & MACHINE INTELLIGENCE (ISCMI 2020), 2020, : 197 - 203
  • [4] Multimodal Emotion Recognition with Thermal and RGB-D Cameras for Human-Robot Interaction
    Yu, Chuang
    Tapus, Adriana
    [J]. HRI'20: COMPANION OF THE 2020 ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION, 2020, : 532 - 534
  • [5] Emotion Recognition in Human-Robot Interaction Using the NAO Robot
    Valagkouti, Iro Athina
    Troussas, Christos
    Krouska, Akrivi
    Feidakis, Michalis
    Sgouropoulou, Cleo
    [J]. COMPUTERS, 2022, 11 (05)
  • [6] A Multimodal Emotion Detection System during Human-Robot Interaction
    Alonso-Martin, Fernando
    Malfaz, Maria
    Sequeira, Joao
    Gorostiza, Javier F.
    Salichs, Miguel A.
    [J]. SENSORS, 2013, 13 (11) : 15549 - 15581
  • [7] MULTIMODAL HUMAN ACTION RECOGNITION IN ASSISTIVE HUMAN-ROBOT INTERACTION
    Rodomagoulakis, I.
    Kardaris, N.
    Pitsikalis, V.
    Mavroudi, E.
    Katsamanis, A.
    Tsiami, A.
    Maragos, P.
    [J]. 2016 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING PROCEEDINGS, 2016, : 2702 - 2706
  • [8] Interaction Intention Recognition via Human Emotion for Human-Robot Natural Interaction
    Yang, Shengtian
    Guan, Yisheng
    Li, Yihui
    Shi, Wenjing
    [J]. 2022 IEEE/ASME INTERNATIONAL CONFERENCE ON ADVANCED INTELLIGENT MECHATRONICS (AIM), 2022, : 380 - 385
  • [9] Feature Reduction for Dimensional Emotion Recognition in Human-Robot Interaction
    Banda, Ntombikayise
    Engelbrecht, Andries
    Robinson, Peter
    [J]. 2015 IEEE SYMPOSIUM SERIES ON COMPUTATIONAL INTELLIGENCE (IEEE SSCI), 2015, : 803 - 810
  • [10] Multimodal Uncertainty Reduction for Intention Recognition in Human-Robot Interaction
    Trick, Susanne
    Koert, Dorothea
    Peters, Jan
    Rothkopf, Constantin A.
    [J]. 2019 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2019, : 7009 - 7016