An Accurate Prediction Method of Human Assembly Motion for Human-Robot Collaboration

Cited: 0
Authors
Zhou, Yangzheng [1 ]
Luo, Liang [2 ]
Li, Pengzhong [1 ]
Affiliations
[1] Tongji Univ, Sch Mech Engn, Shanghai 200092, Peoples R China
[2] Tongji Univ, Sino German Coll Postgrad Studies, Shanghai 200092, Peoples R China
Source
SYMMETRY-BASEL | 2024, Vol. 16, No. 01
Keywords
human motion prediction; human-robot collaboration; collaborative assembly; real time; TRAJECTORY PREDICTION;
DOI
10.3390/sym16010118
Chinese Library Classification (CLC)
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences];
Discipline Classification Codes
07 ; 0710 ; 09 ;
Abstract
In human-robot collaborative assembly, robots need to recognize and predict human behaviors accurately and then perform autonomous control and work-route planning in real time. To support the judgment of human intervention behaviors and meet the real-time requirements of human-robot collaboration, the Fast Spatial-Temporal Transformer Network (FST-Trans), an accurate prediction method for human assembly actions, is proposed. We aim to maximize the symmetry between the predicted and actual actions while satisfying the real-time requirement. With a concise and efficient structural design, FST-Trans learns the spatial-temporal interactions of human joints during assembly in the same latent space and captures more complex motion dynamics. Considering that different individuals assemble at inconsistent rates, a velocity-acceleration loss is introduced to force the network to learn more motion variations, realizing accurate prediction of assembly actions. An assembly dataset was collected and constructed for detailed comparative experiments and ablation studies, and the experimental results demonstrate the effectiveness of the proposed method.
Pages: 15
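The abstract above mentions a velocity-acceleration loss used to make the network learn motion variations across individuals who assemble at different rates. The paper's exact formulation is not given in this record, so the following is only a minimal sketch under common assumptions: velocity and acceleration are approximated by first and second finite differences of the joint trajectories along the time axis, errors are measured as mean per-joint Euclidean distances, and the function name and weights (velocity_acceleration_loss, w_vel, w_acc) are hypothetical.

```python
import torch

def velocity_acceleration_loss(pred, target, w_vel=1.0, w_acc=1.0):
    """Hedged sketch of a velocity-acceleration loss (not the paper's exact formula).

    pred, target: tensors of shape (batch, time, joints, 3) holding predicted
    and ground-truth 3D joint positions.
    """
    # Position term: mean per-joint Euclidean error.
    pos_loss = (pred - target).norm(dim=-1).mean()

    # First finite difference along time approximates velocity (assumption).
    pred_vel = pred[:, 1:] - pred[:, :-1]
    tgt_vel = target[:, 1:] - target[:, :-1]
    vel_loss = (pred_vel - tgt_vel).norm(dim=-1).mean()

    # Second finite difference approximates acceleration (assumption).
    pred_acc = pred_vel[:, 1:] - pred_vel[:, :-1]
    tgt_acc = tgt_vel[:, 1:] - tgt_vel[:, :-1]
    acc_loss = (pred_acc - tgt_acc).norm(dim=-1).mean()

    return pos_loss + w_vel * vel_loss + w_acc * acc_loss

# Example usage with dummy data: 8 sequences, 25 frames, 17 joints, xyz coordinates.
pred = torch.randn(8, 25, 17, 3)
target = torch.randn(8, 25, 17, 3)
loss = velocity_acceleration_loss(pred, target)
```

The intent of such a loss, as described in the abstract, is that penalizing differences in velocity and acceleration (not just position) pushes the model to match how fast and how smoothly a person moves, which matters when assembly rates vary between individuals.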