Learning Robotic Manipulation through Visual Planning and Acting

Cited by: 0
Authors
Wang, Angelina [1 ]
Kurutach, Thanard [1 ]
Liu, Kara [1 ]
Abbeel, Pieter [1 ]
Tamar, Aviv [2 ]
Affiliations
[1] Univ Calif Berkeley, EECS Dept, Berkeley, CA 94720 USA
[2] Technion, Dept Elect Engn, Haifa, Israel
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TP24 [Robotics];
Subject Classification Code
080202; 1405
Abstract
Planning for robotic manipulation requires reasoning about the changes a robot can effect on objects. When such interactions can be modelled analytically, as in domains with rigid objects, efficient planning algorithms exist. However, in both domestic and industrial domains, the objects of interest can be soft, or deformable, and hard to model analytically. For such cases, we posit that a data-driven modelling approach is more suitable. In recent years, progress in deep generative models has produced methods that learn to 'imagine' plausible images from data. Building on the recent Causal InfoGAN generative model, in this work we learn to imagine goal-directed object manipulation directly from raw image data of self-supervised interaction of the robot with the object. After learning, given a goal observation of the system, our model can generate an imagined plan - a sequence of images that transition the object into the desired goal. To execute the plan, we use it as a reference trajectory to track with a visual servoing controller, which we also learn from the data as an inverse dynamics model. In a simulated manipulation task, we show that separating the problem into visual planning and visual tracking control is more sample-efficient and more interpretable than alternative data-driven approaches. We further demonstrate our approach on learning to imagine and execute in three environments, the last of which is deformable rope manipulation on a PR2 robot.
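As a reading aid, the following is a minimal sketch of the planning-then-tracking loop the abstract describes: a learned visual planner imagines a sequence of waypoint images from the current observation to the goal, and a learned inverse dynamics model acts as the visual servoing controller that tracks them. The interfaces (visual_planner, inverse_dynamics_model, env) are illustrative assumptions, not the authors' actual code.

# Illustrative sketch (not the authors' implementation) of the two-stage
# pipeline: plan in image space, then track the imagined trajectory with a
# learned inverse dynamics model. All object names are assumptions.
def execute_imagined_plan(env, visual_planner, inverse_dynamics_model, goal_image):
    """Imagine a visual plan to the goal, then track it step by step."""
    current_image = env.get_observation()

    # Stage 1: visual planning -- imagine intermediate images that
    # transition the object from the current state to the goal.
    imagined_plan = visual_planner.plan(current_image, goal_image)

    # Stage 2: visual tracking -- for each imagined waypoint, query the
    # inverse dynamics model for an action that moves the current
    # observation toward that waypoint, and execute it on the robot.
    for waypoint_image in imagined_plan:
        action = inverse_dynamics_model.predict(current_image, waypoint_image)
        current_image = env.step(action)

    return current_image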
Pages: 10
Related Papers
50 records in total
  • [41] Adaptive Optimization of Hyper-Parameters for Robotic Manipulation through Evolutionary Reinforcement Learning
    Onori, Giulio
    Shahid, Asad Ali
    Braghin, Francesco
    Roveda, Loris
    JOURNAL OF INTELLIGENT & ROBOTIC SYSTEMS, 2024, 110 (03)
  • [42] On motion planning for robotic manipulation with permanent rolling contacts
    Kiss, M
    Lévine, J
    Lantos, B
    INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH, 2002, 21 (5-6): 443-461
  • [43] Interlinked Visual Tracking and Robotic Manipulation of Articulated Objects
    Paolillo, Antonio
    Chappellet, Kevin
    Bolotnikova, Anastasia
    Kheddar, Abderrahmane
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2018, 3 (04): 2746-2753
  • [44] Visual Repetition Sampling for Robot Manipulation Planning
    Puang, En Yen
    Lehner, Peter
    Marton, Zoltan-Csaba
    Durner, Maximilian
    Triebel, Rudolph
    Albu-Schaeffer, Alin
    2019 INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2019: 9236-9242
  • [45] Learning Sequences of Manipulation Primitives for Robotic Assembly
    Nghia Vuong
    Hung Pham
    Quang-Cuong Pham
    2021 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA 2021), 2021: 4086-4092
  • [46] Remote learning for the manipulation and control of robotic cells
    Goldstain, O.
    Ben-Gal, I.
    Bukchin, Andy.
    EUROPEAN JOURNAL OF ENGINEERING EDUCATION, 2007, 32 (04): 481-494
  • [47] Machine learning meets advanced robotic manipulation
    Nahavandi, Saeid
    Alizadehsani, Roohallah
    Nahavandi, Darius
    Lim, Chee Peng
    Kelly, Kevin
    Bello, Fernando
    INFORMATION FUSION, 2024, 105
  • [48] Towards Learning of Generic Skills for Robotic Manipulation
    Metzen, Jan Hendrik
    Fabisch, Alexander
    Senger, Lisa
    Fernandez, Jose de Gea
    Kirchner, Elsa Andrea
    KUNSTLICHE INTELLIGENZ, 2014, 28 (01): 15-20
  • [49] Learning to Design and Use Tools for Robotic Manipulation
    Liu, Ziang
    Tian, Stephen
    Guo, Michelle
    Liu, C. Karen
    Wu, Jiajun
    CONFERENCE ON ROBOT LEARNING, VOL 229, 2023, 229
  • [50] Learning to Scaffold the Development of Robotic Manipulation Skills
    Shao, Lin
    Migimatsu, Toki
    Bohg, Jeannette
    2020 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2020: 5671-5677