Development of an Autonomous Sanding Robot with Structured-Light Technology

Times Cited: 0
Authors:
Huo, Yingxin [1 ]
Chen, Diancheng [1 ]
Li, Xiang [2 ,3 ]
Li, Peng [4 ]
Liu, Yun-Hui [1 ]
Affiliations:
[1] Chinese Univ Hong Kong, Dept Mech & Automat Engn, Hong Kong, Peoples R China
[2] Tsinghua Univ, Dept Automat, Beijing, Peoples R China
[3] CUHK Shenzhen Res Inst, Shenzhen, Peoples R China
[4] Harbin Inst Technol Shenzhen, Dept Mech Engn & Automat, Shenzhen, Peoples R China
Keywords:
IMPEDANCE CONTROL; FORCE
DOI:
10.1109/iros40897.2019.8968067
Chinese Library Classification:
TP18 [Artificial Intelligence Theory]
Subject Classification Codes:
081104; 0812; 0835; 1405
Abstract:
There is large demand for robotics and automation in sanding work, as current manual operations are labor-intensive, yield inconsistent quality, and pose safety and health risks. While several machines have been developed to automate one or two steps of the sanding process, the autonomy of existing solutions is limited, and human assistance or supervision is still heavily required for calibrating target objects and planning robot motions and tasks. This paper presents the development of an autonomous sanding robot that can sand an unknown object automatically, without any prior calibration or human intervention. The developed robot works as follows. First, the target object is scanned and modeled with a structured-light camera. Second, the robot motion is planned to cover all surfaces of the object with an optimized transition sequence. Third, the robot is controlled to sand the object under a desired impedance model. A prototype of the sanding robot was fabricated, and its performance was validated on the task of sanding a batch of wooden boxes. With sufficient degrees of freedom (DOFs) and customization of the end effector, the developed robot can provide a general solution for autonomous sanding of many other objects.
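The third step regulates contact force through an impedance model. As a minimal illustration of that idea (not the authors' implementation; the masses, damping, and stiffness values below are all hypothetical), the following one-dimensional sketch presses a tool against a compliant surface using a damping-type impedance along the contact normal:

```python
def simulate_impedance(f_d, k_env, steps=5000, dt=1e-3, m=1.0, b=60.0):
    """Damping-type impedance control along the contact normal.

    The tool obeys  m*x_dd + b*x_d = f_d - f_env,  where f_env = k_env * x
    is the reaction force of the surface for penetration x > 0.  With zero
    stiffness in the impedance model, the steady-state contact force
    converges to the desired force f_d.

    f_d    desired contact force [N]   (hypothetical value)
    k_env  environment stiffness [N/m] (hypothetical value)
    Returns the contact force after `steps` integration steps.
    """
    x, x_dot = 0.0, 0.0  # start at the surface, at rest
    f_env = 0.0
    for _ in range(steps):
        f_env = k_env * x if x > 0.0 else 0.0  # contact only when penetrating
        x_dd = (f_d - f_env - b * x_dot) / m   # impedance dynamics
        x_dot += x_dd * dt                     # semi-implicit Euler step
        x += x_dot * dt
    return f_env

# With these illustrative numbers the contact force settles at f_d:
print(round(simulate_impedance(10.0, 1e4), 3))  # -> 10.0
```

The stiffness term is deliberately set to zero along the normal: with a nonzero impedance stiffness k, the steady-state force would instead be f_d * k_env / (k + k_env), i.e., pure position-based impedance control carries an inherent force-tracking error on stiff surfaces.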
Pages: 2855-2860
Page Count: 6