Learning Robotic Manipulation through Visual Planning and Acting

Cited: 0
Authors
Wang, Angelina [1 ]
Kurutach, Thanard [1 ]
Liu, Kara [1 ]
Abbeel, Pieter [1 ]
Tamar, Aviv [2 ]
Affiliations
[1] Univ Calif Berkeley, EECS Dept, Berkeley, CA 94720 USA
[2] Technion, Dept Elect Engn, Haifa, Israel
Keywords
DOI
Not available
CLC number
TP24 [Robotics];
Discipline codes
080202; 1405;
Abstract
Planning for robotic manipulation requires reasoning about the changes a robot can effect on objects. When such interactions can be modelled analytically, as in domains with rigid objects, efficient planning algorithms exist. However, in both domestic and industrial domains, the objects of interest can be soft, or deformable, and hard to model analytically. For such cases, we posit that a data-driven modelling approach is more suitable. In recent years, progress in deep generative models has produced methods that learn to 'imagine' plausible images from data. Building on the recent Causal InfoGAN generative model, in this work we learn to imagine goal-directed object manipulation directly from raw image data of self-supervised interaction of the robot with the object. After learning, given a goal observation of the system, our model can generate an imagined plan - a sequence of images that transitions the object into the desired goal. To execute the plan, we use it as a reference trajectory to track with a visual servoing controller, which we also learn from the data as an inverse dynamics model. In a simulated manipulation task, we show that separating the problem into visual planning and visual tracking control is more sample efficient and more interpretable than alternative data-driven approaches. We further demonstrate our approach on learning to imagine and execute in three environments, the last of which is deformable rope manipulation on a PR2 robot.
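The abstract's decomposition - an imagined image plan used as a reference trajectory, tracked by a learned inverse-dynamics controller - can be illustrated with a minimal sketch. This is not the authors' code; the function and variable names (`track_plan`, `inv_dyn`, the toy 1-D environment) are illustrative assumptions, and the "perfect" inverse-dynamics model stands in for one learned from self-supervised interaction data.

```python
# Hedged sketch of the plan-then-track loop described in the abstract:
# a planner produces a sequence of intermediate observations (waypoints),
# and an inverse-dynamics model maps (current, next) observation pairs
# to the action that connects them.

def track_plan(plan, inverse_dynamics, step, observe):
    """Treat the imagined plan as a reference trajectory: for each
    waypoint, query the inverse-dynamics model for an action and
    execute it in the environment."""
    for waypoint in plan:
        action = inverse_dynamics(observe(), waypoint)
        step(action)


# Toy 1-D illustration with a perfect inverse-dynamics model.
state = {"x": 0.0}
observe = lambda: state["x"]                    # current observation
step = lambda a: state.update(x=state["x"] + a) # apply action
inv_dyn = lambda obs, target: target - obs      # action = needed displacement

track_plan([1.0, 2.5, 4.0], inv_dyn, step, observe)
print(state["x"])  # ends at the final waypoint: 4.0
```

In the paper's setting the waypoints are generated images and the inverse-dynamics model is learned from raw interaction data, but the control structure is the same: tracking intermediate subgoals rather than planning actions end-to-end.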
Pages: 10
Related Papers (50 records)
  • [31] Burbridge, Chris; Saigol, Zeyn; Schmidt, Florian; Borst, Christoph; Dearden, Richard. Learning Operators for Manipulation Planning. 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2012: 686-693.
  • [32] Tangwongsan, S; Fu, KS. Application of Learning to Robotic Planning. International Journal of Computer & Information Sciences, 1979, 8(4): 303-333.
  • [33] Hamilton, A; Trodd, N; Zhang, XN; Fernando, T; Watson, K. Learning through visual systems to enhance the urban planning process. Environment and Planning B: Planning & Design, 2001, 28(6): 833-845.
  • [34] Shah, Anuj; Lopes, Gabriel A. D.; Najafi, Esmaeil. Contact-Based Language for Robotic Manipulation Planning. 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2016: 5695-5700.
  • [35] Collins, J.; Robson, M.; Yamada, J.; Sridharan, M.; Janik, K.; Posner, I. RAMP: A Benchmark for Evaluating Robotic Assembly Manipulation and Planning. IEEE Robotics and Automation Letters, 2024, 9(1): 9-16.
  • [36] Saha, Mitul; Isto, Pekka. Motion planning for robotic manipulation of deformable linear objects. 2006 IEEE International Conference on Robotics and Automation (ICRA), 2006: 2478+.
  • [37] Saha, Mitul; Isto, Pekka; Latombe, Jean-Claude. Motion planning for robotic manipulation of deformable linear objects. Experimental Robotics, 2008, 39: 23+.
  • [38] Zheng, XZ; Ono, K; Yamakita, M; Katayama, M; Ito, K. A robotic dynamic manipulation system with trajectory planning and control. ETFA '96: 1996 IEEE Conference on Emerging Technologies and Factory Automation, 1996: 309-315.
  • [39] Driess, Danny; Oguz, Ozgur; Ha, Jung-Su; Toussaint, Marc. Deep Visual Heuristics: Learning Feasibility of Mixed-Integer Programs for Manipulation Planning. 2020 IEEE International Conference on Robotics and Automation (ICRA), 2020: 9563-9569.
  • [40] Quinton, Jean-Charles; Lengagne, Sebastien. Abstract planning over control primitives for robotic manipulation. Cognitive Processing, 2015, 16: S98.