Tracking cloth deformation: A novel dataset for closing the sim-to-real gap for robotic cloth manipulation learning

Cited by: 1
Authors
Coltraro, Franco [1 ,2 ,3 ]
Borras, Julia [1 ]
Alberich-Carraminana, Maria [1 ,2 ,3 ]
Torras, Carme [1 ]
Affiliations
[1] CSIC UPC, Inst Robot & Informat Ind, Llorens & Artigas 4-6, Barcelona 08028, Spain
[2] Univ Politecn Catalunya BarcelonaTech, UPC BarcelonaTech IMTech, Dept Matemat, Barcelona, Spain
[3] Univ Politecn Catalunya BarcelonaTech, UPC BarcelonaTech IMTech, Inst Matemat, Barcelona, Spain
Keywords
Cloth manipulation; real datasets; robotic learning; motion capture; cloth simulation; sim-to-real gap
DOI
10.1177/02783649251317617
Chinese Library Classification
TP24 [Robotics]
Discipline Classification Codes
080202; 1405
Abstract
Robotic learning for deformable object manipulation, such as textiles, is often done in simulation because current perception methods are limited in their ability to capture cloth deformation. For this reason, the robotics community is continually searching for more realistic simulators that reduce the sim-to-real gap, which remains large, especially when dynamic motions are involved. We present a cloth dataset consisting of 120 high-quality recordings of several textiles during dynamic motions. Using a motion capture system, we record the locations of key points on the cloth surface for four types of fabric (cotton, denim, wool, and polyester), in two sizes and at different speeds. The scenarios considered are all dynamic and involve rapid shaking and twisting of the textiles, collisions with frictional objects, strong hits with a long, thin rigid object, and even self-collisions. We explain in detail the scenarios considered, the collected data, and how to read and use it. In addition, we propose a metric that uses the dataset as a benchmark to quantify the sim-to-real gap of any cloth simulator. Finally, we show that the recorded trajectories can be directly executed by a robotic arm, enabling learning by demonstration and other imitation learning techniques.
Dataset: https://doi.org/10.5281/zenodo.14644526
Video: https://fcoltraro.github.io/projects/dataset/
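The abstract refers to a benchmark metric for quantifying a simulator's sim-to-real gap against the recorded key-point trajectories. The following is only a minimal illustrative sketch, not the metric defined in the paper: it assumes the recording and the simulation are available as aligned arrays of per-frame 3D key-point positions and scores the gap as the mean Euclidean tracking error.

```python
import numpy as np

def trajectory_gap(real: np.ndarray, sim: np.ndarray) -> float:
    """Mean per-key-point Euclidean error, averaged over all frames.

    Both arrays are assumed to have shape (T, N, 3): T frames of N
    tracked 3D key-point positions, in the same units and marker order.
    This is a generic tracking-error score, not necessarily the metric
    proposed in the article.
    """
    assert real.shape == sim.shape, "trajectories must be aligned"
    per_point_err = np.linalg.norm(real - sim, axis=-1)  # shape (T, N)
    return float(per_point_err.mean())

# Illustrative usage with synthetic placeholder data (not the dataset's actual file format):
T, N = 100, 9                                  # e.g. 100 frames, 9 markers
real = np.random.rand(T, N, 3)                 # recorded marker positions (m)
sim = real + 0.01 * np.random.randn(T, N, 3)   # a simulator's prediction
print(f"mean tracking error: {trajectory_gap(real, sim):.4f} m")
```

Lower scores indicate a simulator that tracks the recorded deformation more closely; the paper's own metric and data layout should be consulted for benchmark-comparable numbers.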
Pages: 12