Multi-Step Object Extraction Planning From Clutter Based on Support Relations

Cited by: 0
Authors
Motoda, Tomohiro [1 ]
Petit, Damien [1 ]
Nishi, Takao [1 ]
Nagata, Kazuyuki [2 ]
Wan, Weiwei [1 ]
Harada, Kensuke [1 ,3 ]
Affiliations
[1] Osaka Univ, Grad Sch Engn Sci, Osaka 5608531, Japan
[2] Reitaku Univ, Future Engn Res Ctr, Chiba 2778686, Japan
[3] Natl Inst Adv Ind Sci & Technol, Ind CPS Res Ctr, Tokyo 1350064, Japan
Funding
Japan Society for the Promotion of Science (JSPS);
Keywords
Robots; Clutter; Planning; Visualization; Logistics; Heating systems; Deep learning; Automation; Manufacturing automation; Manipulators; Deep learning in grasping and manipulation; logistics; factory automation; manipulation planning; bimanual manipulation; prediction; location; picking
DOI
10.1109/ACCESS.2023.3273289
Chinese Library Classification (CLC)
TP [Automation Technology; Computer Technology];
Discipline Code
0812;
Abstract
To automate operations in a logistics warehouse, a robot must extract items from clutter on a shelf without collapsing the pile. To address this problem, this study proposes a multi-step motion planner that stably extracts an item by using the support relations among the objects in the clutter. The study primarily focuses on safe extraction, which allows the robot to choose the best next action based on limited observations. By estimating the support relations, we construct a collapse prediction graph that yields an appropriate order of object extraction, so that the target object can be extracted without collapsing the pile. Furthermore, we show that the robot's efficiency improves when it uses one arm to extract the target object while the other supports a neighboring object. The proposed method is evaluated in real-world experiments on support-relation detection and object extraction tasks. The experimental results indicate that the robot can estimate support relations based on collapse predictions and perform safe extraction in real environments, and that the multi-step extraction plan achieves both better performance and robustness in safe object extraction from clutter.
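The core idea in the abstract — ordering extractions from estimated support relations so that no object is removed while other objects still rest on it — amounts to a reverse topological sort over a support graph. The following Python sketch is illustrative only; the function name, graph encoding, and example objects are assumptions for exposition, not the authors' implementation (which additionally estimates the relations from vision and plans bimanual motions).

```python
# Illustrative sketch: given pairwise support relations ("a supports b"),
# compute a safe extraction order in which no object is removed while
# another object still rests on it (a reverse topological sort).
from collections import defaultdict, deque

def safe_extraction_order(objects, supports):
    """objects: list of object ids; supports: (supporter, supported) pairs."""
    supporters_of = defaultdict(list)        # supported -> its supporters
    load = {o: 0 for o in objects}           # count of objects still resting on o
    for supporter, supported in supports:
        supporters_of[supported].append(supporter)
        load[supporter] += 1

    free = deque(o for o in objects if load[o] == 0)  # safe to grasp now
    order = []
    while free:
        obj = free.popleft()
        order.append(obj)
        for s in supporters_of[obj]:         # obj no longer rests on s
            load[s] -= 1
            if load[s] == 0:
                free.append(s)

    if len(order) != len(objects):
        raise ValueError("cyclic support relations: no collapse-safe order")
    return order

# A box supporting both a can and a cup must be extracted last:
print(safe_extraction_order(["box", "can", "cup"],
                            [("box", "can"), ("box", "cup")]))
# -> ['can', 'cup', 'box']
```

In this encoding, an object becomes "free" only once everything resting on it has been removed, which is exactly the safety property the collapse prediction graph is used to enforce.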
Pages: 45129-45139
Page count: 11
Related Papers (50 total)
  • [1] Danielczuk, Michael; Kurenkov, Andrey; Balakrishna, Ashwin; Matl, Matthew; Wang, David; Martin-Martin, Roberto; Garg, Animesh; Savarese, Silvio; Goldberg, Ken. Mechanical Search: Multi-Step Retrieval of a Target Object Occluded by Clutter. 2019 International Conference on Robotics and Automation (ICRA), 2019: 1614-1621.
  • [2] Miller, Kevin J.; Venditto, Sarah Jo C. Multi-step planning in the brain. Current Opinion in Behavioral Sciences, 2021, 38: 29-39.
  • [3] Feng, Bo; Rao, Junjie; Zhou, Kang. On multi-step BCFW recursion relations. Journal of High Energy Physics, 2015.
  • [4] Feng, Bo; Rao, Junjie; Zhou, Kang. On multi-step BCFW recursion relations. Journal of High Energy Physics, 2015, (07).
  • [5] Pflueger, Max; Sukhatme, Gaurav S. Multi-Step Planning for Robotic Manipulation. 2015 IEEE International Conference on Robotics and Automation (ICRA), 2015: 2496-2501.
  • [6] Mathew, Hanna; Kunde, Wilfried; Herbort, Oliver. Inverting the planning gradient: adjustment of grasps to late segments of multi-step object manipulations. Experimental Brain Research, 2017, 235: 1397-1409.
  • [7] Mathew, Hanna; Kunde, Wilfried; Herbort, Oliver. Inverting the planning gradient: adjustment of grasps to late segments of multi-step object manipulations. Experimental Brain Research, 2017, 235 (05): 1397-1409.
  • [8] Deutsch, B.; Zobel, M.; Denzler, J.; Niemann, H. Multi-step entropy based sensor control for visual object tracking. Pattern Recognition, 2004, 3175: 359-366.
  • [9] Deutsch, Benjamin; Wenhardt, Stefan; Niemann, Heinrich. Multi-step multi-camera view planning for real-time visual object tracking. Pattern Recognition, Proceedings, 2006, 4174: 536-545.
  • [10] Hauser, K.; Bretl, T.; Latombe, J. C. Learning-assisted multi-step planning. 2005 IEEE International Conference on Robotics and Automation (ICRA), Vols 1-4, 2005: 4575-4580.