A multimodal human-robot sign language interaction framework applied in social robots

Cited: 3
Authors
Li, Jie [1 ]
Zhong, Junpei [2 ]
Wang, Ning [3 ]
Affiliations
[1] Chongqing Technol & Business Univ, Sch Artificial Intelligence, Chongqing, Peoples R China
[2] Hong Kong Polytech Univ, Dept Rehabil Sci, Hong Kong, Peoples R China
[3] Univ West England, Bristol Robot Lab, Bristol, England
Keywords
social robots; sign language; gesture recognition; multimodal sensors; human-robot interaction; HAND GESTURE RECOGNITION; EMG FEATURE EVALUATION; VISION;
DOI
10.3389/fnins.2023.1168888
CLC number
Q189 [Neuroscience]
Discipline code
071006
Abstract
Deaf-mute people face many difficulties in daily interaction with hearing people through spoken language. Sign language is an important means of expression and communication for the deaf-mute community. Breaking the communication barrier between the deaf-mute and hearing communities is therefore significant for facilitating their integration into society. To help them integrate into social life better, we propose a multimodal Chinese sign language (CSL) gesture interaction framework based on social robots. CSL gesture information, covering both static and dynamic gestures, is captured by two sensors of different modalities: a wearable Myo armband collects surface electromyography (sEMG) signals from the human arm, and a Leap Motion sensor collects 3D hand vectors. The two modalities of gesture data are preprocessed and fused before being sent to the classifier, which improves recognition accuracy and reduces the network's processing time. Since the inputs to the proposed framework are temporal gesture sequences, a long short-term memory (LSTM) recurrent neural network is used to classify them. Comparative experiments performed on a NAO robot validate our method. The results show that our method effectively improves CSL gesture recognition accuracy and has potential applications in a variety of gesture interaction scenarios beyond social robots.
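The pipeline the abstract describes — feature-level fusion of per-frame sEMG and Leap Motion features, followed by an LSTM sequence classifier — can be sketched as below. This is an illustrative sketch only, not the authors' code: the feature dimensions (8 sEMG channels from the Myo armband, a 15-dim Leap Motion hand-vector feature per frame), the z-score preprocessing, and the random weights are all assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed dimensions: T frames, 8 sEMG channels (Myo), 15-dim Leap features,
# hidden size H, and 10 gesture classes.
T, D_EMG, D_LEAP, H, N_CLASSES = 30, 8, 15, 32, 10

def fuse(emg, leap):
    """Feature-level fusion: z-score each modality, then concatenate per frame."""
    emg = (emg - emg.mean(0)) / (emg.std(0) + 1e-8)
    leap = (leap - leap.mean(0)) / (leap.std(0) + 1e-8)
    return np.concatenate([emg, leap], axis=1)  # shape (T, D_EMG + D_LEAP)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_classify(x, params):
    """Run a single-layer LSTM over the fused sequence; classify the final state."""
    Wx, Wh, b, Wy, by = params
    h = np.zeros(H)
    c = np.zeros(H)
    for x_t in x:
        z = Wx @ x_t + Wh @ h + b              # all four gate pre-activations
        i, f, o, g = np.split(z, 4)
        i, f, o, g = sigmoid(i), sigmoid(f), sigmoid(o), np.tanh(g)
        c = f * c + i * g                      # cell-state update
        h = o * np.tanh(c)                     # hidden-state update
    logits = Wy @ h + by
    e = np.exp(logits - logits.max())
    return e / e.sum()                         # softmax class probabilities

D = D_EMG + D_LEAP
params = (rng.normal(0, 0.1, (4 * H, D)),      # input-to-gate weights
          rng.normal(0, 0.1, (4 * H, H)),      # hidden-to-gate weights
          np.zeros(4 * H),
          rng.normal(0, 0.1, (N_CLASSES, H)),  # classification head
          np.zeros(N_CLASSES))

emg = rng.normal(size=(T, D_EMG))    # stand-in for preprocessed sEMG frames
leap = rng.normal(size=(T, D_LEAP))  # stand-in for Leap Motion hand vectors
probs = lstm_classify(fuse(emg, leap), params)
print(probs.shape)                   # one probability per gesture class
```

In practice the LSTM weights would be trained on the fused gesture dataset; the sketch only shows how the two sensor streams are combined into one sequence before classification, which is the fusion strategy the abstract credits with the accuracy and time-cost gains.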
Pages: 15