sEMG-based Static Force Estimation for Human-Robot Interaction using Deep Learning

Cited by: 0
Authors
Kim, Sejin [1 ]
Chung, Wan Kyun [2 ]
Kim, Keehoon [2 ]
Affiliations
[1] POSTECH, Robot Lab, Sch Mech Engn, Pohang, South Korea
[2] POSTECH, Fac Mech Engn, Pohang, South Korea
Funding
National Research Foundation of Singapore;
Keywords
ISOMETRIC TORQUE; SIGNAL; EMG;
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Human-robot interaction (HRI) is a rapidly growing research area with applications in human-robot collaboration, human power augmentation, and rehabilitation robotics. Because the intended motion trajectory is hard to compute exactly, interaction control is generally applied in HRI instead of pure motion control. Interaction control requires force information, and force sensors are widely used for force feedback. However, force sensors have several limitations: 1) they are subject to breakdown, 2) they add volume and weight to the system, and 3) the locations where they can be mounted are constrained. Force estimation can therefore be a good alternative. When force must be measured in a static situation, however, position and velocity are not sufficient, because they are no longer influenced by the exerted force. We therefore propose sEMG-based static force estimation using deep learning. sEMG provides useful information about the force a human exerts because it reflects human intention, and a deep learning approach is used to capture the complex relationship between sEMG and force. Experimental results show that when a force with a maximal value of 63.2 N was exerted, the average force estimation error was 3.67 N. The proposed method also shows that the onset timing of the estimated force is earlier than that of the force sensor signal, which is advantageous for faster human intention recognition.
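The abstract describes regressing exerted force from sEMG with a deep network. A minimal sketch of that idea is shown below; this is not the paper's actual architecture — the window length, RMS features, synthetic data, network size, and learning rate are all assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

def rms_features(emg_windows):
    """RMS amplitude per channel; emg_windows: (n, channels, window_len)."""
    return np.sqrt(np.mean(emg_windows ** 2, axis=2))

# Synthetic stand-in data: 4 sEMG channels, 200-sample windows.
n, ch, win = 256, 4, 200
emg = rng.normal(size=(n, ch, win)) * rng.uniform(0.1, 1.0, size=(n, ch, 1))
force = 63.2 * emg.std(axis=(1, 2))   # pseudo ground-truth force [N]

X = rms_features(emg)                 # (n, ch) feature matrix
y = force[:, None]                    # (n, 1) regression target

# One-hidden-layer network trained with plain gradient descent on MSE.
W1 = rng.normal(scale=0.5, size=(ch, 16)); b1 = np.zeros(16)
W2 = rng.normal(scale=0.5, size=(16, 1));  b2 = np.zeros(1)

losses, lr = [], 0.01
for _ in range(500):
    h = np.tanh(X @ W1 + b1)          # hidden activations
    pred = h @ W2 + b2                # estimated force
    err = pred - y
    losses.append(float(np.mean(err ** 2)))
    # Backpropagate through both layers.
    gW2 = h.T @ err / n; gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)
    gW1 = X.T @ dh / n;  gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

print(f"MSE: {losses[0]:.2f} -> {losses[-1]:.2f}")
```

In practice the paper's deep model would be trained on real multi-channel sEMG recorded during static force exertion, but the feature-extraction-plus-regression structure above captures the basic pipeline.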
Pages: 81-86 (6 pages)
Related Papers
(50 in total)
  • [41] sEMG-based deep learning framework for the automatic detection of knee abnormality
    Vijayvargiya, Ankit
    Singh, Bharat
    Kumari, Nidhi
    Kumar, Rajesh
    SIGNAL IMAGE AND VIDEO PROCESSING, 2023, 17 (04) : 1087 - 1095
  • [43] Constructive learning for human-robot interaction
    Singh, Amarjot
    Karanam, Srikrishna
    Kumar, Devinder
    IEEE Potentials, 2013, 32 (04): : 13 - 19
  • [44] Online learning for human-robot interaction
    Raducanu, Bogdan
    Vitria, Jordi
    2007 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, VOLS 1-8, 2007, : 3342 - +
  • [45] Design of Elbow Rehabilitation Exoskeleton Robot with sEMG-based Torque Estimation Control Strategy
    Yang, Nachuan
    Li, Juncheng
    Xu, Pengpeng
    Zeng, Ziniu
    Cai, Siqi
    Xie, Longhan
    2022 6TH INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION SCIENCES (ICRAS 2022), 2022, : 105 - 113
  • [46] FORCE/POSITION CONTROL OF ROBOT MANIPULATOR FOR HUMAN-ROBOT INTERACTION
    Neranon, Paramin
    Bicker, Robert
    THERMAL SCIENCE, 2016, 20 : S537 - S548
  • [47] sEMG-based continuous estimation of joint angles of human legs by using BP neural network
    Zhang, Feng
    Li, Pengfeng
    Hou, Zeng-Guang
    Lu, Zhen
    Chen, Yixiong
    Li, Qingling
    Tan, Min
    NEUROCOMPUTING, 2012, 78 (01) : 139 - 148
  • [48] Skill Learning for Human-Robot Interaction Using Wearable Device
    Fang, Bin
    Wei, Xiang
    Sun, Fuchun
    Huang, Haiming
    Yu, Yuanlong
    Liu, Huaping
    TSINGHUA SCIENCE AND TECHNOLOGY, 2019, 24 (06) : 654 - 662
  • [50] Optimized Assistive Human-Robot Interaction Using Reinforcement Learning
    Modares, Hamidreza
    Ranatunga, Isura
    Lewis, Frank L.
    Popa, Dan O.
    IEEE TRANSACTIONS ON CYBERNETICS, 2016, 46 (03) : 655 - 667