Learning From Visual Demonstrations via Replayed Task-Contrastive Model-Agnostic Meta-Learning

Cited by: 1
Authors
Hu, Ziye [1 ]
Li, Wei [1 ,2 ]
Gan, Zhongxue [1 ,2 ]
Guo, Weikun [1 ]
Zhu, Jiwei [1 ]
Wen, James Zhiqing [2 ]
Zhou, Decheng [2 ]
Affiliations
[1] Fudan Univ, Acad Engn & Technol, Shanghai 200433, Peoples R China
[2] Ji Hua Lab, Ctr Intelligent Robot, Dept Engn Res, Foshan 528200, Guangdong, Peoples R China
Keywords
Task analysis; Robots; Microstrip; Visualization; Adaptation models; Training; Reinforcement learning; Meta-learning; learning from demonstrations; one-shot visual imitation learning; learning to learn
DOI
10.1109/TCSVT.2022.3197147
CLC classification: TM [Electrical Technology]; TN [Electronic Technology, Communication Technology]
Discipline codes: 0808; 0809
Abstract
With the increasing deployment of versatile robots, the need for end-users to teach robotic tasks via visual/video demonstrations in different environments is growing rapidly. One possible approach is meta-learning. However, most meta-learning methods are tailored to image classification or focus only on teaching the robot what to do, which limits the robot's ability to adapt to the real world. We therefore propose a novel yet efficient model-agnostic meta-learning framework based on task-contrastive learning that teaches the robot both what to do and what not to do through positive and negative demonstrations. Our approach divides learning from visual/video demonstrations into three parts. The first part distinguishes between positive and negative demonstrations via task-contrastive learning. The second part emphasizes what the positive demonstration is doing, and the last part predicts what the robot needs to do. Finally, we demonstrate the effectiveness of our meta-learning approach on 1) two standard public simulated benchmarks and 2) real-world placing experiments with a UR5 robot arm, significantly outperforming related state-of-the-art methods.
Pages: 8756-8767 (12 pages)