A mixed perception-based human-robot collaborative maintenance approach driven by augmented reality and online deep reinforcement learning

Cited by: 14
Authors
Liu, Changchun [1 ]
Zhang, Zequn [1 ]
Tang, Dunbing [1 ]
Nie, Qingwei [1 ]
Zhang, Linqi [1 ]
Song, Jiaye [1 ]
Affiliations
[1] Nanjing Univ Aeronaut & Astronaut, Coll Mech & Elect Engn, Nanjing 210016, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Human-robot collaborative maintenance; Mixed perception; Decision-making; Online deep reinforcement learning; Augmented reality; HAND GESTURE RECOGNITION; NEURAL-NETWORK; PREDICTION; SELECTION;
DOI
10.1016/j.rcim.2023.102568
Chinese Library Classification
TP39 [Computer Applications];
Subject Classification Codes
081203; 0835;
Abstract
Because machines in Industry 4.0 are growing in both number and complexity, the maintenance process is increasingly time-consuming and labor-intensive and involves many refined maintenance operations. Fortunately, human-robot collaboration (HRC) can integrate human intelligence into the collaborative robot (cobot), realizing not only the nimble, knowledge-based maintenance operations of personnel but also the reliable, repeatable maintenance manipulation of cobots. However, existing HRC maintenance lacks a precise understanding of maintenance intention, efficient HRC decision-making for executing robotized maintenance tasks (e.g., repetitive manual tasks), and a convenient interaction interface for executing cognitive tasks (e.g., maintenance preparation and guidance jobs). Hence, a mixed perception-based human-robot collaborative maintenance approach consisting of a three-hierarchy structure is proposed in this paper to alleviate these problems. In the first stage, a mixed perception module is proposed to help the cobot recognize human safety status and maintenance requests from human actions and gestures, respectively. In the second stage, an improved online deep reinforcement learning (DRL)-enabled decision-making module with an asynchronous structure and an anti-disturbance function is proposed, which realizes the execution of robotized maintenance tasks. In the third stage, an augmented reality (AR)-assisted, user-friendly interaction interface is designed to help personnel interact with the cobot and execute auxiliary maintenance tasks without the limitations of spatial and human factors; auxiliary maintenance operations are further supported by AR-assisted visual guidance. Finally, comparative numerical experiments are implemented in a typical machining workshop, and the results show the competitive performance of the proposed HRC maintenance approach compared with other state-of-the-art methods.
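The abstract names an asynchronous online DRL decision-making stage but does not give the paper's networks, reward design, or task model. The sketch below is therefore only a minimal illustration of the generic asynchronous online-RL pattern it alludes to (several worker threads updating one shared value table, here plain tabular Q-learning rather than the paper's deep networks); every environment detail, name, and hyperparameter is hypothetical.

# Minimal illustrative sketch, NOT the paper's implementation: asynchronous
# online RL where worker threads update a shared policy while interacting
# with independent copies of a toy "maintenance task assignment" environment.
import random
import threading

N_STATES, N_ACTIONS = 8, 3          # hypothetical: 8 task states, 3 cobot actions
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.2   # learning rate, discount, exploration rate

shared_q = [[0.0] * N_ACTIONS for _ in range(N_STATES)]  # shared value table
lock = threading.Lock()             # serializes the asynchronous updates

def step(state, action):
    # Hypothetical environment: reward is highest when the chosen action
    # matches the task type (state % N_ACTIONS); then a new task arrives.
    reward = 1.0 if action == state % N_ACTIONS else -0.1
    return random.randrange(N_STATES), reward

def worker(n_steps):
    state = random.randrange(N_STATES)
    for _ in range(n_steps):
        # epsilon-greedy action selection against the shared table
        if random.random() < EPS:
            action = random.randrange(N_ACTIONS)
        else:
            action = max(range(N_ACTIONS), key=lambda a: shared_q[state][a])
        next_state, reward = step(state, action)
        # asynchronous online update of the shared parameters
        with lock:
            target = reward + GAMMA * max(shared_q[next_state])
            shared_q[state][action] += ALPHA * (target - shared_q[state][action])
        state = next_state

threads = [threading.Thread(target=worker, args=(5000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print("greedy action per state:",
      [max(range(N_ACTIONS), key=lambda a: shared_q[s][a]) for s in range(N_STATES)])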
Pages: 24
Related Papers
50 records in total
  • [23] Human-Robot Collaboration: A Literature Review and Augmented Reality Approach in Design
    Green, Scott A.
    Billinghurst, Mark
    Chen, XiaoQi
    Chase, J. Geoffrey
    INTERNATIONAL JOURNAL OF ADVANCED ROBOTIC SYSTEMS, 2008, 5 (01) : 1 - 18
  • [24] Deep Learning of Augmented Reality based Human Interactions for Automating a Robot Team
    Dias, Adhitha
    Wellaboda, Hasitha
    Rasanka, Yasod
    Munasinghe, Menusha
    Rodrigo, Ranga
    Jayasekara, Peshala
    2020 6TH INTERNATIONAL CONFERENCE ON CONTROL, AUTOMATION AND ROBOTICS (ICCAR), 2020, : 175 - 182
  • [25] Deep Reinforcement Learning with Interactive Feedback in a Human-Robot Environment
    Moreira, Ithan
    Rivas, Javier
    Cruz, Francisco
    Dazeley, Richard
    Ayala, Angel
    Fernandes, Bruno
    APPLIED SCIENCES-BASEL, 2020, 10 (16):
  • [26] Human-robot force cooperation analysis by deep reinforcement learning
    Li, Shaodong
    Yuan, Xiaogang
    Yu, Hongjian
    INDUSTRIAL ROBOT-THE INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH AND APPLICATION, 2023, 50 (02): : 287 - 298
  • [27] Virtual, Augmented, and Mixed Reality for Human-Robot Interaction (VAM-HRI)
    Williams, Tom
    Szafir, Daniel
    Chakraborti, Tathagata
    Phillips, Elizabeth
    HRI '19: 2019 14TH ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION, 2019, : 671 - 672
  • [28] Virtual, Augmented, and Mixed Reality for Human-Robot Interaction (VAM-HRI)
    Wozniak, Maciej
    Chang, Christine T.
    Luebbers, Matthew B.
    Ikeda, Bryce
    Walker, Michael
    Rosen, Eric
    Groechel, Thomas Roy
    COMPANION OF THE ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION, HRI 2023, 2023, : 938 - 940
  • [29] Virtual, Augmented, and Mixed Reality for Human-Robot Interaction (VAM-HRI)
    Williams, Tom
    Szafir, Daniel
    Chakraborti, Tathagata
    Khim, Ong Soh
    Rosen, Eric
    Booth, Serena
    Groechel, Thomas
    HRI'20: COMPANION OF THE 2020 ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION, 2020, : 663 - 664
  • [30] Virtual, Augmented, and Mixed Reality for Human-Robot Interaction (VAM-HRI)
    Rosen, Eric
    Groechel, Thomas
    Walker, Michael E.
    Chang, Christine T.
    Forde, Jessica Zosa
    HRI '21: COMPANION OF THE 2021 ACM/IEEE INTERNATIONAL CONFERENCE ON HUMAN-ROBOT INTERACTION, 2021, : 721 - 723