Human-Robot Interaction Using Markovian Emotional Model Based on Facial Recognition

Cited by: 5
Authors
Maeda, Yoichiro [1 ]
Geshi, Shotaro [2 ]
Affiliations
[1] Ritsumeikan Univ, 1-1-1 Noji Higashi, Kusatsu, Shiga 5258577, Japan
[2] Kanden Syst Solut Co Inc, Kita Ku, 3-3-20 Umeda, Osaka 5308226, Japan
Keywords
Emotion; Facial Expression; Communication Robot; Markovian Emotional Model; SOM;
DOI
10.1109/SCIS-ISIS.2018.00044
CLC Number
TP18 [Theory of Artificial Intelligence];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
Interactive emotion communication (IEC) is research on communication between humans and robots through emotional behaviors. In this paper, we propose an IEC method for emotion generation using a Markovian emotional model (MEM) based on differences in the characteristic quantities of human facial expressions obtained by a self-organizing map (SOM). For example, when a person's emotion changes, an angry person is thought to be more likely to burst into tears than into laughter. In other words, expressions whose features resemble each other transition between emotions more easily. In this study, the emotion transition probabilities are obtained from a questionnaire, and an interaction experiment with the proposed emotion transitions is carried out using a communication robot.
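The abstract's core mechanism — emotions as states of a Markov chain, with questionnaire-derived transition probabilities deciding which emotion follows the current one — can be sketched as follows. This is an illustrative sketch only: the emotion labels and all probability values below are hypothetical placeholders, not the values estimated in the paper.

```python
import random

# Hypothetical emotion states for a Markovian emotional model (MEM).
EMOTIONS = ["neutral", "happy", "angry", "sad"]

# Hypothetical transition matrix (each row sums to 1). Following the
# abstract's example, "angry" is given a higher probability of moving
# to "sad" (bursting into tears) than to "happy" (bursting into laughter).
TRANSITIONS = {
    "neutral": [0.55, 0.20, 0.15, 0.10],
    "happy":   [0.30, 0.50, 0.05, 0.15],
    "angry":   [0.25, 0.05, 0.45, 0.25],
    "sad":     [0.30, 0.10, 0.15, 0.45],
}

def next_emotion(current: str, rng: random.Random) -> str:
    """Sample the next emotional state from the current state's
    transition distribution."""
    return rng.choices(EMOTIONS, weights=TRANSITIONS[current], k=1)[0]

if __name__ == "__main__":
    # Simulate a short emotion trajectory starting from "angry".
    rng = random.Random(0)
    state = "angry"
    trajectory = [state]
    for _ in range(5):
        state = next_emotion(state, rng)
        trajectory.append(state)
    print(" -> ".join(trajectory))
```

In the paper itself the transition probabilities come from questionnaire data and are combined with SOM-based facial-feature similarity; this sketch shows only the Markov-chain sampling step.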
Pages: 209-214
Page count: 6
Related Papers
50 records in total
  • [41] Emotion in human-robot interaction: Recognition and display
    Wendt, Cornelia
    Kuehnlenz, Kolja
    Popp, Michael
    Karg, Michelle
    [J]. INTERNATIONAL JOURNAL OF PSYCHOLOGY, 2008, 43 (3-4) : 578 - 578
  • [42] Recognition in Human-Robot Interaction: The Gateway to Engagement
    Brinck, Ingar
    Balkenius, Christian
    [J]. 2019 JOINT IEEE 9TH INTERNATIONAL CONFERENCE ON DEVELOPMENT AND LEARNING AND EPIGENETIC ROBOTICS (ICDL-EPIROB), 2019, : 31 - 36
  • [43] Gesture spotting and recognition for human-robot interaction
    Yang, Hee-Deok
    Park, A-Yeon
    Lee, Seong-Whan
    [J]. IEEE TRANSACTIONS ON ROBOTICS, 2007, 23 (02) : 256 - 270
  • [44] Facial Emotion Expressions in Human-Robot Interaction: A Survey
    Rawal, Niyati
    Stock-Homburg, Ruth Maria
    [J]. INTERNATIONAL JOURNAL OF SOCIAL ROBOTICS, 2022, 14 (07) : 1583 - 1604
  • [45] Problems with using a human-dog interaction model for human-robot interaction?
    Dahl, Torbjorn S.
    [J]. INTERACTION STUDIES, 2014, 15 (02) : 190 - 194
  • [46] Active Affective Facial Analysis For Human-Robot Interaction
    Ge, Shuzhi Sam
    Samani, Hooman Aghaebrahimi
    Ong, Yin Hao Janus
    Hang, Chang Chieh
    [J]. 2008 17TH IEEE INTERNATIONAL SYMPOSIUM ON ROBOT AND HUMAN INTERACTIVE COMMUNICATION, VOLS 1 AND 2, 2008, : 83 - 88
  • [47] Tracking of Facial Features to Support Human-Robot Interaction
    Pateraki, Maria
    Baltzakis, Haris
    Kondaxakis, Polychronis
    Trahanias, Panos
    [J]. ICRA: 2009 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, VOLS 1-7, 2009, : 2651 - 2656
  • [48] Synthesizing facial expressions in dyadic human-robot interaction
    Sham, Abdallah Hussein
    Tikka, Pia
    Lamas, David
    Anbarjafari, Gholamreza
    [J]. SIGNAL IMAGE AND VIDEO PROCESSING, 2024, 18 (SUPPL 1) : 909 - 918
  • [49] Facial Expressions Recognition for Human-Robot Interaction Using Deep Convolutional Neural Networks with Rectified Adam Optimizer
    Melinte, Daniel Octavian
    Vladareanu, Luige
    [J]. SENSORS, 2020, 20 (08)
  • [50] Convolutional Features-Based Broad Learning With LSTM for Multidimensional Facial Emotion Recognition in Human-Robot Interaction
    Chen, Luefeng
    Li, Min
    Wu, Min
    Pedrycz, Witold
    Hirota, Kaoru
    [J]. IEEE TRANSACTIONS ON SYSTEMS MAN CYBERNETICS-SYSTEMS, 2024, 54 (01): : 64 - 75