Multimodal fusion and human-robot interaction control of an intelligent robot

Times cited: 1
Authors
Gong, Tao [1 ]
Chen, Dan [1 ]
Wang, Guangping [2 ]
Zhang, Weicai [2 ]
Zhang, Junqi [2 ]
Ouyang, Zhongchuan [2 ]
Zhang, Fan [2 ]
Sun, Ruifeng [2 ]
Ji, Jiancheng Charles [1 ]
Chen, Wei [1 ]
Affiliations
[1] Shenzhen Polytech Univ, Inst Intelligent Mfg, Shenzhen, Peoples R China
[2] AVIC Changhe Aircraft Ind Grp Corp Ltd, Jingdezhen, Peoples R China
Keywords
kinematic modeling; robotic walker; multimodal fusion; human-robot interaction control; stroke
DOI
10.3389/fbioe.2023.1310247
Chinese Library Classification (CLC)
Q81 [Bioengineering (Biotechnology)]; Q93 [Microbiology]
Discipline classification codes
071005; 0836; 090102; 100705
Abstract
Introduction: Small-scale robotic walkers play an increasingly important role in assisting Activities of Daily Living (ADL) in the face of ever-increasing rehabilitation demands and the drawbacks of existing equipment. This paper proposes a Rehabilitation Robotic Walker (RRW) for walking assistance and body weight support (BWS) during gait rehabilitation.
Methods: The walker provides patients with weight offloading and a guiding force that mimics a series of the physiotherapist's (PT's) movements, creating a natural, comfortable, and safe environment. The system consists of an omnidirectional mobile platform, a BWS mechanism, and a pelvic brace that smooths the motion of the pelvis. To recognize human intentions, four force sensors, two joysticks, and one depth-sensing camera monitor the human-machine interaction, and a multimodal fusion algorithm for intention recognition is proposed to improve recognition accuracy. The system obtains the heading angle E from the camera, the pelvic pose F from the force sensors, and the motion vector H from the joysticks, classifies the intentions through feature extraction and information fusion, and finally outputs motor speed commands through the robot's kinematics.
Results: To validate the algorithm, a preliminary motion-control test with three volunteers was conducted. The results showed an average integral square error (ISE) of 2.90 and a minimum of 1.96.
Discussion: The results demonstrate the efficiency of the proposed method and show that the system is capable of providing walking assistance.
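The abstract describes a pipeline in which the heading angle E (depth camera), the pelvic pose F (force sensors), and the motion vector H (joysticks) are fused into an intention estimate and then mapped to motor speeds through the robot's kinematics. The record does not include the authors' code, so the following is only a minimal Python sketch of such a pipeline: the modality weights, gains, velocity limits, and the four-mecanum-wheel kinematic model are assumptions for illustration, not the paper's implementation.

```python
# Illustrative sketch only: all names, weights, gains, and the mecanum-wheel
# kinematic model below are assumptions, not the authors' published method.
import numpy as np

# Hypothetical confidence weights for the three modalities.
W_CAMERA, W_FORCE, W_JOYSTICK = 0.3, 0.3, 0.4

def fuse_intention(heading_angle_E, pelvic_force_F, joystick_H):
    """Fuse camera, force-sensor, and joystick inputs into a body velocity (vx, vy, wz).

    heading_angle_E : heading angle from the depth camera, in radians
    pelvic_force_F  : (fx, fy) horizontal interaction force at the pelvic brace, in newtons
    joystick_H      : (jx, jy) joystick deflection, each component in [-1, 1]
    """
    # Camera modality: walk along the observed heading at a nominal speed (m/s).
    v_cam = 0.3 * np.array([np.cos(heading_angle_E), np.sin(heading_angle_E)])
    # Force modality: interaction force scaled by an admittance-like gain.
    v_force = 0.01 * np.asarray(pelvic_force_F, dtype=float)
    # Joystick modality: direct velocity command.
    v_joy = 0.5 * np.asarray(joystick_H, dtype=float)

    # Weighted information fusion of the linear velocity commands.
    v_xy = W_CAMERA * v_cam + W_FORCE * v_force + W_JOYSTICK * v_joy
    # Yaw rate tracks the camera heading with a proportional gain.
    wz = 1.0 * heading_angle_E
    # Saturate the command for safety before sending it to the base.
    return np.clip(np.array([v_xy[0], v_xy[1], wz]), -0.6, 0.6)

def wheel_speeds(v, wheel_radius=0.075, half_length=0.35, half_width=0.30):
    """Inverse kinematics of a four-mecanum-wheel omnidirectional base (assumed layout)."""
    vx, vy, wz = v
    L = half_length + half_width
    # Standard mecanum mapping from body velocity to wheel angular speeds (rad/s).
    J = np.array([[1, -1, -L],
                  [1,  1,  L],
                  [1,  1, -L],
                  [1, -1,  L]]) / wheel_radius
    return J @ np.array([vx, vy, wz])

if __name__ == "__main__":
    v_cmd = fuse_intention(np.deg2rad(10), (12.0, -4.0), (0.4, 0.1))
    print("body velocity command:", v_cmd)
    print("wheel angular speeds :", wheel_speeds(v_cmd))
```

Under these assumptions, a simple weighted sum stands in for the paper's feature-extraction and fusion stage; the actual classifier and kinematic parameters would come from the RRW hardware and the method described in the full text.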
Pages: 10
Related papers (50 in total)
  • [1] Multimodal Information Fusion for Human-Robot Interaction
    Luo, Ren C.
    Wu, Y. C.
    Lin, P. H.
    [J]. 2015 IEEE 10TH JUBILEE INTERNATIONAL SYMPOSIUM ON APPLIED COMPUTATIONAL INTELLIGENCE AND INFORMATICS (SACI), 2015, : 535 - 540
  • [2] A HUMAN-ROBOT INTERACTION CONTROL ARCHITECTURE FOR AN INTELLIGENT ASSISTIVE ROBOT
    Ficocelli, Maurizio
    Nejat, Goldie
    Jhin, Greg Minseok
    [J]. PROCEEDINGS OF THE ASME INTERNATIONAL DESIGN ENGINEERING TECHNICAL CONFERENCES AND COMPUTERS AND INFORMATION IN ENGINEERING CONFERENCE, VOL 7, PTS A AND B, 2010, : 937 - 946
  • [3] Intelligent Control Architecture for Human-Robot Interaction
    Alves, Silas F. R.
    Silva, Ivan N.
    Ferasoli Filho, Humberto
    [J]. 2014 2ND BRAZILIAN ROBOTICS SYMPOSIUM (SBR) / 11TH LATIN AMERICAN ROBOTICS SYMPOSIUM (LARS) / 6TH ROBOCONTROL WORKSHOP ON APPLIED ROBOTICS AND AUTOMATION, 2014, : 259 - 264
  • [4] A unified multimodal control framework for human-robot interaction
    Cherubini, Andrea
    Passama, Robin
    Fraisse, Philippe
    Crosnier, Andre
    [J]. ROBOTICS AND AUTONOMOUS SYSTEMS, 2015, 70 : 106 - 115
  • [5] Human-robot interaction and robot control
    Sequeira, Joao
    Ribeiro, Maria Isabel
    [J]. ROBOT MOTION AND CONTROL: RECENT DEVELOPMENTS, 2006, 335 : 375 - 390
  • [6] Multimodal Fusion as Communicative Acts during Human-Robot Interaction
    Alonso-Martin, Fernando
    Gorostiza, Javier F.
    Malfaz, Maria
    Salichs, Miguel A.
    [J]. CYBERNETICS AND SYSTEMS, 2013, 44 (08) : 681 - 703
  • [7] Intelligent Speech Control System for Human-Robot Interaction
    Liu, Xiaomei
    Ge, Shuzhi Sam
    Jiang, Rui
    Goh, Cher-Hiang
    [J]. PROCEEDINGS OF THE 35TH CHINESE CONTROL CONFERENCE 2016, 2016, : 6154 - 6159
  • [8] Multimodal Interaction for Human-Robot Teams
    Burke, Dustin
    Schurr, Nathan
    Ayers, Jeanine
    Rousseau, Jeff
    Fertitta, John
    Carlin, Alan
    Dumond, Danielle
    [J]. UNMANNED SYSTEMS TECHNOLOGY XV, 2013, 8741
  • [9] An Intelligent Human-Robot Interaction Framework to Control the Human Attention
    Hoque, Mohammed Moshiul
    Deb, Kaushik
    Das, Dipankar
    Kobayashi, Yoshinori
    Kuno, Yoshinori
    [J]. 2013 INTERNATIONAL CONFERENCE ON INFORMATICS, ELECTRONICS & VISION (ICIEV), 2013,
  • [10] Designing a Multimodal Human-Robot Interaction Interface for an Industrial Robot
    Mocan, Bogdan
    Fulea, Mircea
    Brad, Stelian
    [J]. ADVANCES IN ROBOT DESIGN AND INTELLIGENT CONTROL, 2016, 371 : 255 - 263