A mixed perception-based human-robot collaborative maintenance approach driven by augmented reality and online deep reinforcement learning

Cited by: 14
Authors
Liu, Changchun [1 ]
Zhang, Zequn [1 ]
Tang, Dunbing [1 ]
Nie, Qingwei [1 ]
Zhang, Linqi [1 ]
Song, Jiaye [1 ]
Affiliations
[1] Nanjing Univ Aeronaut & Astronaut, Coll Mech & Elect Engn, Nanjing 210016, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Human-robot collaborative maintenance; Mixed perception; Decision-making; Online deep reinforcement learning; Augmented reality; HAND GESTURE RECOGNITION; NEURAL-NETWORK; PREDICTION; SELECTION;
DOI
10.1016/j.rcim.2023.102568
Chinese Library Classification (CLC)
TP39 [Computer Applications];
Subject Classification Codes
081203; 0835;
Abstract
Because the number and complexity of machines keep increasing in Industry 4.0, the maintenance process has become more time-consuming and labor-intensive and involves many refined maintenance operations. Fortunately, human-robot collaboration (HRC) can integrate human intelligence into the collaborative robot (cobot), realizing not only the dexterous and intelligent maintenance operations of personnel but also the reliable and repeatable maintenance manipulation of cobots. However, existing HRC maintenance lacks a precise understanding of maintenance intention, efficient HRC decision-making for executing robotized maintenance tasks (e.g., repetitive manual tasks), and a convenient interaction interface for executing cognitive tasks (e.g., maintenance preparation and guidance jobs). Hence, a mixed perception-based human-robot collaborative maintenance approach consisting of a three-hierarchy structure is proposed in this paper to mitigate these problems. In the first stage, a mixed perception module is proposed to help the cobot recognize human safety and maintenance requests from human actions and gestures, respectively. In the second stage, an improved online deep reinforcement learning (DRL)-enabled decision-making module with an asynchronous structure and an anti-disturbance function is proposed, which realizes the execution of robotized maintenance tasks. In the third stage, an augmented reality (AR)-assisted, user-friendly interaction interface is designed to help personnel interact with the cobot and execute auxiliary maintenance tasks without the limitations of spatial and human factors; auxiliary maintenance operations are further supported by AR-assisted visual guidance. Finally, comparative numerical experiments are conducted in a typical machining workshop, and the results show competitive performance of the proposed HRC maintenance approach compared with other state-of-the-art methods.
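The abstract describes the second-stage decision-making module only at a high level. As a rough illustration of what an online DRL decision loop for assigning robotized maintenance tasks can look like, the sketch below implements a minimal epsilon-greedy Q-learning agent in PyTorch. The state encoding, action set, reward signal, network size, and all hyperparameters are illustrative assumptions and are not taken from the paper; the paper's actual module additionally uses an asynchronous structure and an anti-disturbance mechanism that are not reproduced here.

# Minimal illustrative sketch (NOT the paper's implementation): an online
# DQN-style decision-maker that maps an encoded perception state (e.g.,
# recognized gesture, human-safety flag, pending task type) to one of a
# fixed set of robotized maintenance actions. Dimensions, reward, and
# hyperparameters are hypothetical placeholders.
import random

import torch
import torch.nn as nn
import torch.optim as optim

STATE_DIM = 6      # assumed length of the encoded perception vector
NUM_ACTIONS = 4    # assumed number of candidate robotized maintenance tasks


class QNetwork(nn.Module):
    """Small MLP approximating Q(s, a) for each candidate task."""

    def __init__(self, state_dim: int, num_actions: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, num_actions),
        )

    def forward(self, state: torch.Tensor) -> torch.Tensor:
        return self.net(state)


class OnlineDecisionMaker:
    """Epsilon-greedy online Q-learning over a stream of perceived states."""

    def __init__(self, gamma: float = 0.95, epsilon: float = 0.1, lr: float = 1e-3):
        self.q = QNetwork(STATE_DIM, NUM_ACTIONS)
        self.optimizer = optim.Adam(self.q.parameters(), lr=lr)
        self.gamma = gamma
        self.epsilon = epsilon

    def select_action(self, state: torch.Tensor) -> int:
        # Explore occasionally; otherwise pick the task with the highest Q-value.
        if random.random() < self.epsilon:
            return random.randrange(NUM_ACTIONS)
        with torch.no_grad():
            return int(self.q(state).argmax().item())

    def update(self, state, action, reward, next_state, done):
        # One-step temporal-difference update on the most recent transition.
        q_sa = self.q(state)[action]
        with torch.no_grad():
            bootstrap = 0.0 if done else self.gamma * self.q(next_state).max().item()
        target = torch.tensor(reward + bootstrap)
        loss = nn.functional.mse_loss(q_sa, target)
        self.optimizer.zero_grad()
        loss.backward()
        self.optimizer.step()


if __name__ == "__main__":
    # Placeholder state/reward stream standing in for the perception module
    # and task-completion feedback described in the abstract.
    agent = OnlineDecisionMaker()
    state = torch.rand(STATE_DIM)
    for _ in range(100):
        action = agent.select_action(state)
        reward = random.uniform(-1.0, 1.0)
        next_state = torch.rand(STATE_DIM)
        agent.update(state, action, reward, next_state, done=False)
        state = next_state

In a real deployment, the placeholder state stream would be replaced by the perception module's action/gesture and safety outputs, and the reward by task-completion and safety feedback.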
Pages: 24