One-Shot Imitation from Observing Humans via Domain-Adaptive Meta-Learning

Cited by: 0
Authors
Yu, Tianhe [1 ]
Finn, Chelsea [1 ]
Xie, Annie [1 ]
Dasari, Sudeep [1 ]
Zhang, Tianhao [1 ]
Abbeel, Pieter [1 ]
Levine, Sergey [1 ]
Affiliations
[1] Univ Calif Berkeley, Berkeley, CA 94720 USA
Keywords
DOI
Not available
CLC Number
TP24 [Robotics];
Subject Classification Code
080202 ; 1405 ;
Abstract
Humans and animals are capable of learning a new behavior by observing others perform the skill just once. We consider the problem of allowing a robot to do the same - learning from a video of a human, even when there is domain shift in the perspective, environment, and embodiment between the robot and the observed human. Prior approaches to this problem have hand-specified how human and robot actions correspond and often relied on explicit human pose detection systems. In this work, we present an approach for one-shot learning from a video of a human by using human and robot demonstration data from a variety of previous tasks to build up prior knowledge through meta-learning. Then, combining this prior knowledge and only a single video demonstration from a human, the robot can perform the task that the human demonstrated. We show experiments on both a PR2 arm and a Sawyer arm, demonstrating that after meta-learning, the robot can learn to place, push, and pick-and-place new objects using just one video of a human performing the manipulation.
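A minimal sketch of the meta-learning structure described in the abstract, under stated assumptions: following the model-agnostic meta-learning recipe, an inner gradient step adapts the policy from a human demonstration (which carries no robot action labels) through a learned adaptation loss, and an outer behavioral-cloning objective on a paired robot demonstration trains both the policy initialization and that adaptation loss. The network sizes, the adapt_loss_net module, the toy feature tensors, and the learning rates below are illustrative assumptions written in PyTorch, not the authors' architecture or code.

import torch
import torch.nn as nn
import torch.nn.functional as F

OBS_DIM, ACT_DIM, INNER_LR = 32, 7, 0.01  # toy dimensions and step size (assumptions)

# Policy: maps observation features to robot actions.
policy = nn.Sequential(nn.Linear(OBS_DIM, 64), nn.ReLU(), nn.Linear(64, ACT_DIM))
# Learned adaptation objective: scores the policy's outputs on a human demo,
# standing in for the paper's learned loss (an assumption of this sketch).
adapt_loss_net = nn.Sequential(nn.Linear(ACT_DIM, 32), nn.ReLU(), nn.Linear(32, 1))
meta_opt = torch.optim.Adam(
    list(policy.parameters()) + list(adapt_loss_net.parameters()), lr=1e-3)

def policy_forward(obs, params):
    # Run the two-layer policy with explicitly supplied weights, so the
    # post-inner-update (adapted) parameters can be used in the outer loss.
    h = F.relu(F.linear(obs, params[0], params[1]))
    return F.linear(h, params[2], params[3])

for meta_iter in range(1000):
    # One meta-training task: a human video demo (features only, no actions)
    # and a robot demo (observations plus expert actions). Random stand-ins here.
    human_obs = torch.randn(20, OBS_DIM)
    robot_obs = torch.randn(20, OBS_DIM)
    robot_act = torch.randn(20, ACT_DIM)

    params = list(policy.parameters())

    # Inner step: adapt the policy from the human demo alone via the learned loss.
    inner_loss = adapt_loss_net(policy_forward(human_obs, params)).mean()
    grads = torch.autograd.grad(inner_loss, params, create_graph=True)
    adapted = [p - INNER_LR * g for p, g in zip(params, grads)]

    # Outer step: behavioral cloning of the adapted policy on the robot demo;
    # gradients flow into both the policy initialization and adapt_loss_net.
    outer_loss = F.mse_loss(policy_forward(robot_obs, adapted), robot_act)
    meta_opt.zero_grad()
    outer_loss.backward()
    meta_opt.step()

At test time, only the inner step of this sketch would run: a single gradient step on one human video yields task-specific policy weights that are then executed on the robot.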
Pages: 10
Related Papers
50 records in total
  • [1] One-Shot Domain-Adaptive Imitation Learning via Progressive Learning Applied to Robotic Pouring
    Zhang, Dandan
    Fan, Wen
    Lloyd, John
    Yang, Chenguang
    Lepora, Nathan F.
    IEEE TRANSACTIONS ON AUTOMATION SCIENCE AND ENGINEERING, 2024, 21 (01) : 541 - 554
  • [2] Domain-Adaptive Discriminative One-Shot Learning of Gestures
    Pfister, Tomas
    Charles, James
    Zisserman, Andrew
    COMPUTER VISION - ECCV 2014, PT VI, 2014, 8694 : 814 - 829
  • [3] Learning One-Shot Imitation From Humans Without Humans
    Bonardi, Alessandro
    James, Stephen
    Davison, Andrew J.
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2020, 5 (02): : 3533 - 3539
  • [4] One-Shot Imitation Learning
    Duan, Yan
    Andrychowicz, Marcin
    Stadie, Bradly
    Ho, Jonathan
    Schneider, Jonas
    Sutskever, Ilya
    Abbeel, Pieter
    Zaremba, Wojciech
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 30 (NIPS 2017), 2017, 30
  • [5] Learning from Demonstrations via Deformable Residual Multi-Attention Domain-Adaptive Meta-Learning
    Yan, Zeyu
    Gan, Zhongxue
    Lu, Gaoxiong
    Liu, Junxiu
    Li, Wei
    BIOMIMETICS, 2025, 10 (02)
  • [6] One-shot Imitation Learning via Interaction Warping
    Biza, Ondrej
    Thompson, Skye
    Pagidi, Kishore Reddy
    Kumar, Abhinav
    van der Pol, Elise
    Walters, Robin
    Kipf, Thomas
    van de Meent, Jan-Willem
    Wong, Lawson L. S.
    Platt, Robert
    CONFERENCE ON ROBOT LEARNING, VOL 229, 2023, 229
  • [7] Learning With Dual Demonstration Domains: Random Domain-Adaptive Meta-Learning
    Hu, Ziye
    Li, Wei
    Gan, Zhongxue
    Guo, Weikun
    Zhu, Jiwei
    Gao, Xiang
    Yang, Xuyun
    Peng, Yueyan
    Zuo, Zhihao
    Wen, James Zhiqing
    Zhou, Decheng
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2022, 7 (02) : 3523 - 3530
  • [8] Learning From Demonstrations Via Multi-Level and Multi-Attention Domain-Adaptive Meta-Learning
    Hu, Ziye
    Gan, Zhongxue
    Li, Wei
    Guo, Weikun
    Gao, Xiang
    Zhu, Jiwei
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2022, 7 (04): : 11910 - 11917
  • [9] Two-Stage Model-Agnostic Meta-Learning With Noise Mechanism for One-Shot Imitation
    Hu, Ziye
    Gan, Zhongxue
    Li, Wei
    Wen, James Zhiqing
    Zhou, Decheng
    Wang, Xusheng
    IEEE ACCESS, 2020, 8 : 182720 - 182730
  • [10] Attentive One-Shot Meta-Imitation Learning From Visual Demonstration
    Bhutani, Vishal
    Majumder, Anima
    Vankadari, Madhu
    Dutta, Samrat
    Asati, Aaditya
    Kumar, Swagat
    2022 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, ICRA 2022, 2022, : 8584 - 8590