Robust visual sim-to-real transfer for robotic manipulation

Cited by: 0
Authors
Garcia, Ricardo [1 ]
Strudel, Robin [1 ]
Chen, Shizhe [1 ]
Arlaud, Etienne [1 ]
Laptev, Ivan [1 ]
Schmid, Cordelia [1 ]
Affiliations
[1] PSL Res Univ, CNRS, INRIA, Ecole Normale Super, F-75005 Paris, France
DOI
10.1109/IROS55552.2023.10342471
CLC classification number
TP18 [Artificial Intelligence Theory]
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Learning visuomotor policies in simulation is much safer and cheaper than in the real world. However, due to discrepancies between simulated and real data, simulator-trained policies often fail when transferred to real robots. One common approach to bridging the visual sim-to-real domain gap is domain randomization (DR). While previous work mainly evaluates DR for disembodied tasks, such as pose estimation and object detection, here we systematically explore visual domain randomization methods and benchmark them on a rich set of challenging robotic manipulation tasks. In particular, we propose an offline proxy task of cube localization to select DR parameters for texture randomization, lighting randomization, and variations of object colors and camera parameters. Notably, we demonstrate that DR parameters have a similar impact on our offline proxy task and on online policies. We hence use the offline-optimized DR parameters to train visuomotor policies in simulation and directly apply these policies to a real robot. Our approach achieves a 93% success rate on average when tested on a diverse set of challenging manipulation tasks. Moreover, we evaluate the robustness of policies to visual variations in real scenes and show that our simulator-trained policies outperform policies learned from real but limited data. Code, simulation environment, real robot datasets and trained models are available at https://www.di.ens.fr/willow/research/robust_s2r/.
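To illustrate the idea of visual domain randomization described in the abstract, the sketch below samples per-episode appearance parameters from configured ranges. The parameter names and ranges are purely illustrative assumptions for this example, not the paper's actual DR settings; in the paper these ranges would be selected via the offline cube-localization proxy task.

```python
import random

# Hypothetical DR configuration: each entry maps a visual parameter to the
# (low, high) range it is uniformly sampled from at the start of an episode.
# Names and ranges are illustrative, not taken from the paper.
DR_RANGES = {
    "light_intensity": (0.3, 1.7),    # multiplier on nominal lighting
    "camera_fov_deg": (50.0, 70.0),   # perturbed camera field of view
    "object_hue_shift": (-0.1, 0.1),  # shift applied to object colors
}

def sample_dr_params(rng: random.Random) -> dict:
    """Draw one randomized visual configuration for a simulated episode."""
    return {name: rng.uniform(lo, hi) for name, (lo, hi) in DR_RANGES.items()}

# Example: draw a configuration with a seeded generator for reproducibility.
rng = random.Random(0)
params = sample_dr_params(rng)
```

A policy trained under many such sampled configurations sees enough visual variation in simulation that, ideally, the real scene looks like just another sample from the training distribution.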
Pages: 992 - 999
Page count: 8
Related papers (50 total)
  • [21] SIM-TO-REAL TRANSFER OF VISUAL GROUNDING FOR HUMAN-AIDED AMBIGUITY RESOLUTION
    Tziafas, Georgios
    Kasaei, Hamidreza
    CONFERENCE ON LIFELONG LEARNING AGENTS, VOL 199, 2022, 199
  • [22] Sim-to-real via latent prediction: Transferring visual non-prehensile manipulation policies
    Rizzardo, Carlo
    Chen, Fei
    Caldwell, Darwin
    FRONTIERS IN ROBOTICS AND AI, 2023, 9
  • [23] Tracking cloth deformation: A novel dataset for closing the sim-to-real gap for robotic cloth manipulation learning
    Coltraro, Franco
    Borras, Julia
    Alberich-Carraminana, Maria
    Torras, Carme
    INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH, 2025,
  • [24] Invariance is Key to Generalization: Examining the Role of Representation in Sim-to-Real Transfer for Visual Navigation
    Ai, Bo
    Wu, Zhanxin
    Hsu, David
    EXPERIMENTAL ROBOTICS, ISER 2023, 2024, 30 : 69 - 80
  • [25] DROPO: Sim-to-real transfer with offline domain randomization
    Tiboni, Gabriele
    Arndt, Karol
    Kyrki, Ville
    ROBOTICS AND AUTONOMOUS SYSTEMS, 2023, 166
  • [26] Blind spot detection for safe sim-to-real transfer
    Ramakrishnan, Ramya
    Kamar, Ece
    Dey, Debadeepta
    Horvitz, Eric
    Shah, Julie
    Journal of Artificial Intelligence Research, 2020, 67 : 191 - 234
  • [27] Sim-to-Real Transfer of Bolting Tasks with Tight Tolerance
    Son, Dongwon
    Yang, Hyunsoo
    Lee, Dongjun
    2020 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2020, : 9056 - 9063
  • [28] Machine visual perception from sim-to-real transfer learning for autonomous docking maneuvers
    Worth, Derek
    Choate, Jeffrey
    Raettig, Ryan
    Nykl, Scott
    Taylor, Clark
    NEURAL COMPUTING AND APPLICATIONS, 2025, 37 (4) : 2285 - 2312
  • [29] Learning for Attitude Holding of a Robotic Fish: An End-to-End Approach With Sim-to-Real Transfer
    Zheng, Junzheng
    Zhang, Tianhao
    Wang, Chen
    Xiong, Minglei
    Xie, Guangming
    IEEE TRANSACTIONS ON ROBOTICS, 2022, 38 (02) : 1287 - 1303