Facial expression recognition and tracking for intelligent human-robot interaction

Cited by: 20
Authors
Yang, Y. [1 ,2 ]
Ge, S. S. [1 ,2 ]
Lee, T. H. [1 ,2 ]
Wang, C. [1 ,2 ]
Affiliations
[1] Natl Univ Singapore, Social Robot Lab, Interact Digital Media Inst, Singapore 117576, Singapore
[2] Natl Univ Singapore, Dept Elect Comp Engn, Singapore 117576, Singapore
Keywords
Human-Robot interaction; Facial expression recognition; Affective computing; Distributed locally linear embedding; Facial expression motion energy;
DOI
10.1007/s11370-007-0014-z
Chinese Library Classification (CLC)
TP24 [Robotics];
Discipline codes
080202; 1405;
Abstract
For effective interaction between humans and socially adept, intelligent service robots, a key capability required by this class of sociable robots is the successful interpretation of visual data. Beyond crucial techniques such as human face detection and recognition, an important next step toward intelligence and empathy in social robots is emotion recognition. In this paper, an automated and interactive computer vision system is investigated for human facial expression recognition and tracking based on facial structure features and movement information. Twenty facial features are adopted because they are informative and prominent, reducing ambiguity during classification. An unsupervised learning algorithm, distributed locally linear embedding (DLLE), is introduced to recover the inherent properties of scattered data lying on a manifold embedded in high-dimensional input facial images. Selected person-dependent facial expression images in a video are classified using DLLE. In addition, facial expression motion energy is introduced to describe the tension of the facial muscles during expressions, enabling person-independent recognition. This method takes advantage of optical flow, which tracks the movement information of the feature points. Finally, experimental results show that our approach is able to separate different expressions successfully.
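DLLE is the authors' variant of locally linear embedding (LLE), which recovers a low-dimensional manifold from high-dimensional samples such as face images. As a rough illustration of the base technique only (not the paper's DLLE), the sketch below implements classical LLE in NumPy; the function name `lle` and all parameter defaults are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def lle(X, n_neighbors=5, n_components=2, reg=1e-3):
    """Classical locally linear embedding (illustrative sketch, not DLLE).

    X: (n_samples, n_features) array, e.g. flattened face images.
    Returns an (n_samples, n_components) embedding.
    """
    n = X.shape[0]
    # Find each point's k nearest neighbors by squared Euclidean distance.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)               # a point is not its own neighbor
    nbrs = np.argsort(d2, axis=1)[:, :n_neighbors]

    # Solve for weights reconstructing each point from its neighbors;
    # each row of W sums to 1 over the neighbor columns.
    W = np.zeros((n, n))
    for i in range(n):
        Z = X[nbrs[i]] - X[i]                  # neighbors centered on x_i
        C = Z @ Z.T                            # local covariance
        C += reg * np.trace(C) * np.eye(n_neighbors)  # regularize for stability
        w = np.linalg.solve(C, np.ones(n_neighbors))
        W[i, nbrs[i]] = w / w.sum()

    # Embedding: bottom eigenvectors of M = (I - W)^T (I - W),
    # skipping the trivial constant eigenvector.
    I = np.eye(n)
    M = (I - W).T @ (I - W)
    _, vecs = np.linalg.eigh(M)                # eigenvalues in ascending order
    return vecs[:, 1:n_components + 1]
```

In an expression-recognition pipeline of the kind the abstract describes, each row of `X` would be a vector of facial-feature measurements; expressions could then be classified in the low-dimensional embedding space.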
Pages: 143-157
Page count: 15
Related papers
50 records in total
  • [21] Robot Facial Expression Framework for Enhancing Empathy in Human-Robot Interaction
    Park, Ung
    Kim, Minsoo
    Jang, Youngeun
    Lee, Gijae
    Kim, Kanggeon
    Kim, Ig-Jae
    Choi, Jongsuk
    [J]. 2021 30TH IEEE INTERNATIONAL CONFERENCE ON ROBOT AND HUMAN INTERACTIVE COMMUNICATION (RO-MAN), 2021, : 832 - 838
  • [22] Gesture recognition based on arm tracking for human-robot interaction
    Sigalas, Markos
    Baltzakis, Haris
    Trahanias, Panos
    [J]. IEEE/RSJ 2010 INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS 2010), 2010, : 5424 - 5429
  • [23] Face tracking and hand gesture recognition for human-robot interaction
    Brèthes, L
    Menezes, P
    Lerasle, F
    Hayet, J
    [J]. 2004 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, VOLS 1- 5, PROCEEDINGS, 2004, : 1901 - 1906
  • [24] Robotics facial expression of anger in collaborative human-robot interaction
    Reyes, Mauricio E.
    Meza, Ivan, V
    Pineda, Luis A.
    [J]. INTERNATIONAL JOURNAL OF ADVANCED ROBOTIC SYSTEMS, 2019, 16 (01)
  • [25] Weight-Adapted Convolution Neural Network for Facial Expression Recognition in Human-Robot Interaction
    Wu, Min
    Su, Wanjuan
    Chen, Luefeng
    Liu, Zhentao
    Cao, Weihua
    Hirota, Kaoru
    [J]. IEEE TRANSACTIONS ON SYSTEMS MAN CYBERNETICS-SYSTEMS, 2021, 51 (03): : 1473 - 1484
  • [26] Improving Human-Robot Interaction by Enhancing NAO Robot Awareness of Human Facial Expression
    Filippini, Chiara
    Perpetuini, David
    Cardone, Daniela
    Merla, Arcangelo
    [J]. SENSORS, 2021, 21 (19)
  • [27] Efficient facial expression recognition for human robot interaction
    Dornaika, Fadi
    Raducanu, Bogdan
    [J]. COMPUTATIONAL AND AMBIENT INTELLIGENCE, 2007, 4507 : 700 - +
  • [28] Enhanced Broad Siamese Network for Facial Emotion Recognition in Human-Robot Interaction
    Li, Yikai
    Zhang, Tong
    Philip Chen, C.L.
    [J]. IEEE Transactions on Artificial Intelligence, 2021, 2 (05): : 413 - 423
  • [29] Human-Robot Interaction Using Markovian Emotional Model Based on Facial Recognition
    Maeda, Yoichiro
    Geshi, Shotaro
    [J]. 2018 JOINT 10TH INTERNATIONAL CONFERENCE ON SOFT COMPUTING AND INTELLIGENT SYSTEMS (SCIS) AND 19TH INTERNATIONAL SYMPOSIUM ON ADVANCED INTELLIGENT SYSTEMS (ISIS), 2018, : 209 - 214
  • [30] Facial Communicative Signals: Valence Recognition in Task-Oriented Human-Robot Interaction
    Christian Lang
    Sven Wachsmuth
    Marc Hanheide
    Heiko Wersing
    [J]. International Journal of Social Robotics, 2012, 4 : 249 - 262