Learning configurations of wires for real-time shape estimation and manipulation planning

Cited by: 1
Authors
Mishani, Itamar [1 ]
Sintov, Avishai [2 ]
Affiliations
[1] Carnegie Mellon Univ, Robot Inst, 5000 Forbes Ave, Pittsburgh, PA 15213 USA
[2] Tel Aviv Univ, Sch Mech Engn, Haim Levanon St, IL-6997801 Tel Aviv, Israel
Keywords
Elastic wires; Convolutional autoencoder; Shape estimation; Dual-arm manipulation; Harness; Objects
DOI
10.1016/j.engappai.2023.105967
Chinese Library Classification
TP [automation technology, computer technology]
Discipline code
0812
Abstract
Robotic manipulation of a wire by its ends requires rapid reasoning about its shape in real time. A recently developed analytical model has shown that sensing the force and torque at one end can be used to determine the wire's shape. However, the model relies on assumptions that may not hold for real-world wires, and it does not account for gravity or the non-linearity of the Force/Torque (F/T) sensor. Hence, the model cannot provide accurate shape estimation for an arbitrary wire. In this paper, we explore learning a model that estimates the shape of a wire based solely on measurements of F/T states, without any visual perception; visual perception is used only for off-line data collection. We propose to train a Supervised Autoencoder with convolutional layers that reconstructs the spatial shape of the wire while enforcing the latent space to resemble the space of F/T states. The encoder then operates as a descriptor of the wire through which F/T states can be mapped to its shape, while the decoder addresses the inverse problem, mapping a desired goal shape to the required F/T state. With the same collected data, we also learn the mapping from F/T states to gripper poses. A motion planner can then plan a path within the F/T space to a goal while avoiding obstacles. We validate the proposed data-based approach on Nitinol and standard electrical wires, and demonstrate the ability to accurately estimate their shapes.
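To make the architecture in the abstract concrete, below is a minimal sketch (not the authors' code) of a supervised convolutional autoencoder of the kind described: it reconstructs the sampled wire shape while a supervision term pulls the latent vector toward the measured F/T wrench. The point count N_POINTS, the 6-D wrench dimension, the layer sizes, and the loss weight lam are all illustrative assumptions, not values from the paper.

```python
# Sketch of a supervised convolutional autoencoder for wire shape,
# assuming the shape is sampled as 100 points in 3-D and the F/T state
# is a 6-D wrench. Layer sizes and the weight `lam` are illustrative.
import torch
import torch.nn as nn

N_POINTS = 100   # assumed number of 3-D points sampled along the wire
FT_DIM = 6       # force/torque wrench dimension (3 forces + 3 torques)

class SupervisedWireAE(nn.Module):
    def __init__(self):
        super().__init__()
        # Encoder: shape (B, 3, N_POINTS) -> 6-D latent supervised to match F/T.
        self.encoder = nn.Sequential(
            nn.Conv1d(3, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Flatten(),
            nn.Linear(32 * (N_POINTS // 4), FT_DIM),
        )
        # Decoder: 6-D latent (an F/T state) -> reconstructed shape.
        self.decoder = nn.Sequential(
            nn.Linear(FT_DIM, 32 * (N_POINTS // 4)),
            nn.Unflatten(1, (32, N_POINTS // 4)),
            nn.ConvTranspose1d(32, 16, kernel_size=4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose1d(16, 3, kernel_size=4, stride=2, padding=1),
        )

    def forward(self, shape):
        z = self.encoder(shape)        # latent, trained to resemble the F/T state
        return self.decoder(z), z

def loss_fn(shape, ft, model, lam=1.0):
    recon, z = model(shape)
    # Reconstruction term plus a supervision term pulling the latent
    # toward the measured wrench, tying the F/T space to the latent space.
    return (nn.functional.mse_loss(recon, shape)
            + lam * nn.functional.mse_loss(z, ft))
```

Under these assumptions, once the latent space is aligned with the wrench space, passing a sensed wrench through `model.decoder` would yield a shape estimate without vision, matching the inference-time use the abstract describes.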
Pages: 11
Related papers
50 records in total
  • [1] Real-Time Finger Gaits Planning for Dexterous Manipulation
    Fan, Yongxiang
    Gao, Wei
    Chen, Wenjie
    Tomizuka, Masayoshi
IFAC PAPERSONLINE, 2017, 50 (01): 12765 - 12772
  • [3] Concept for the Real-Time Monitoring of Molecular Configurations during Manipulation with a Scanning Probe Microscope
    Scheidt, Joshua
    Diener, Alexander
    Maiworm, Michael
    Mueller, Klaus-Robert
    Findeisen, Rolf
    Driessens, Kurt
    Tautz, F. Stefan
    Wagner, Christian
JOURNAL OF PHYSICAL CHEMISTRY C, 2023, 127 (28): 13817 - 13836
  • [4] Serving Time: Real-Time, Safe Motion Planning and Control for Manipulation of Unsecured Objects
    Brei, Zachary
    Michaux, Jonathan
    Zhang, Bohao
    Holmes, Patrick
    Vasudevan, Ram
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2024, 9 (03) : 2383 - 2390
  • [5] Real-time shape estimation for continuum robots using vision
    Hannan, MW
    Walker, ID
    ROBOTICA, 2005, 23 : 645 - 651
  • [6] Real-time Yield Estimation based on Deep Learning
    Rahnemoonfar, Maryam
    Sheppard, Clay
    AUTONOMOUS AIR AND GROUND SENSING SYSTEMS FOR AGRICULTURAL OPTIMIZATION AND PHENOTYPING II, 2017, 10218
  • [7] Learning a Confidence Measure for Real-Time Egomotion Estimation
    Lessmann, Stephanie
    Westerhoff, Jens
    Meuter, Mirko
    Pauli, Josef
    PATTERN RECOGNITION, GCPR 2016, 2016, 9796 : 389 - 401
  • [8] Real-Time Non-Visual Shape Estimation and Robotic Dual-Arm Manipulation Control of an Elastic Wire
    Mishani, Itamar
    Sintov, Avishai
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2022, 7 (01) : 422 - 429
  • [9] Real-time planning of humanoid robot's gait for force controlled manipulation
    Harada, K
    Kajita, S
    Kanehiro, F
    Fujiwara, K
    Kaneko, K
    Yokoi, K
    Hirukawa, H
    2004 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, VOLS 1- 5, PROCEEDINGS, 2004, : 616 - 622
  • [10] Real-time Obstacle Avoidance in Robotic Manipulation Using Imitation Learning
    Huang, Jie
    Ge, Wei
    Cheng, Hualong
    Xi, Chun
    Zhu, Jun
    Zhang, Fei
    Shang, Weiwei
    16TH IEEE INTERNATIONAL CONFERENCE ON CONTROL, AUTOMATION, ROBOTICS AND VISION (ICARCV 2020), 2020, : 976 - 981