A virtual-physical collision detection interface for AR-based interactive teaching of robot

Cited by: 23
Authors
Chen, Chengjun [1 ]
Pan, Yong [1 ]
Li, Dongnian [1 ]
Zhang, Shilei [1 ]
Zhao, Zhengxu [1 ]
Hong, Jun [2 ]
Affiliations
[1] Qingdao Univ Technol, Sch Mech & Automot Engn, Qingdao 266000, Shandong, Peoples R China
[2] Xi An Jiao Tong Univ, Sch Mech Engn, Xian 710049, Shaanxi, Peoples R China
Keywords
Robot path planning; Teaching programming; Augmented reality; Virtual-physical collision detection; Depth image
DOI
10.1016/j.rcim.2020.101948
Chinese Library Classification (CLC)
TP39 [Computer applications]
Discipline classification code
081203; 0835
Abstract
At present, online lead-through teaching and offline programming are the methods most widely used to program industrial robots, but both have drawbacks for unskilled shop workers. This paper presents an Augmented Reality (AR)-based interactive robot teaching programming system that projects a virtual robot onto the physical industrial environment. Unskilled shop workers can use a Handheld Teaching Device (HTD) to move the end-effector of the virtual robot, which follows the endpoint of the HTD; in this way, the path of the virtual robot can be planned and tested interactively. Detecting collisions between the virtual robot and the physical environment is key to verifying the feasibility of a robot path. Therefore, a virtual-physical collision detection method is presented that compares the depth values of corresponding pixels in the depth image acquired by a Kinect sensor and a computer-generated image of the virtual robot, in order to obtain collision-free paths for the virtual robot. A quadtree model is used to accelerate the collision detection process and to obtain the distance between the virtual model and the physical environment. With the AR-based interactive robot teaching programming system presented in this paper, all workers, even those unskilled in robot programming, can quickly and effectively obtain a collision-free robot path.
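The core idea in the abstract, comparing the depth values of corresponding pixels in the Kinect depth image and a rendered image of the virtual robot, can be illustrated with a minimal sketch. The code below is an assumption-laden illustration, not the authors' implementation: it assumes both depth maps share the same viewpoint and resolution, introduces a hypothetical COLLISION_MARGIN_M threshold, and replaces the paper's quadtree acceleration with a flat per-pixel comparison.

```python
# Minimal sketch (assumptions only, not the paper's implementation) of per-pixel
# virtual-physical collision detection between a Kinect depth image and a
# rendered depth image of the virtual robot viewed from the same camera pose.
import numpy as np

COLLISION_MARGIN_M = 0.01  # assumed safety margin in metres (hypothetical)


def detect_collisions(kinect_depth: np.ndarray, virtual_depth: np.ndarray):
    """Compare corresponding pixels of two aligned depth maps (metres, 0 = no data).

    A pixel is flagged as colliding when the virtual robot's surface lies at or
    behind the measured physical surface along that viewing ray, i.e. its depth
    is greater than or equal to the physical depth within the margin.
    """
    valid = (kinect_depth > 0) & (virtual_depth > 0)
    # Signed clearance: positive means the virtual surface is closer to the
    # camera than the physical surface, so there is no contact on that ray.
    clearance = kinect_depth - virtual_depth
    collision_mask = valid & (clearance <= COLLISION_MARGIN_M)
    min_clearance = clearance[valid].min() if valid.any() else np.inf
    return collision_mask, min_clearance


if __name__ == "__main__":
    # Synthetic example: a flat wall 1.5 m away and a virtual part that reaches
    # slightly past it in a small image region.
    physical = np.full((480, 640), 1.5, dtype=np.float32)
    virtual = np.zeros_like(physical)        # 0 = virtual robot not rendered here
    virtual[200:240, 300:360] = 1.52         # behind the wall -> collision
    mask, min_clear = detect_collisions(physical, virtual)
    print("colliding pixels:", int(mask.sum()), "min clearance (m):", float(min_clear))
```

In the paper, a quadtree over the depth maps is used to skip large collision-free regions and to report the distance between the virtual model and the physical environment; the flat comparison above is only the per-pixel test that such a hierarchy would accelerate.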
Pages: 10