Learning From Visual Demonstrations via Replayed Task-Contrastive Model-Agnostic Meta-Learning

Cited by: 1
Authors
Hu, Ziye [1 ]
Li, Wei [1 ,2 ]
Gan, Zhongxue [1 ,2 ]
Guo, Weikun [1 ]
Zhu, Jiwei [1 ]
Wen, James Zhiqing [2 ]
Zhou, Decheng [2 ]
Affiliations
[1] Fudan Univ, Acad Engn & Technol, Shanghai 200433, Peoples R China
[2] Ji Hua Lab, Ctr Intelligent Robot, Dept Engn Res, Foshan 528200, Guangdong, Peoples R China
Keywords
Task analysis; Robots; Microstrip; Visualization; Adaptation models; Training; Reinforcement learning; Meta-learning; learning from demonstrations; one-shot visual imitation learning; learning to learn;
DOI
10.1109/TCSVT.2022.3197147
Chinese Library Classification (CLC)
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology]
Discipline classification codes
0808; 0809
Abstract
With the growing deployment of versatile robots, end-users increasingly need to teach robotic tasks via visual/video demonstrations in different environments. One possible approach is meta-learning. However, most meta-learning methods are tailored to image classification or focus only on teaching the robot what to do, limiting the robot's ability to adapt to the real world. We therefore propose a novel yet efficient model-agnostic meta-learning framework based on task-contrastive learning that teaches the robot both what to do and what not to do through positive and negative demonstrations. Our approach divides learning from visual/video demonstrations into three parts: the first distinguishes positive from negative demonstrations via task-contrastive learning; the second emphasizes what the positive demonstration is doing; and the last predicts what the robot needs to do. Finally, we demonstrate the effectiveness of our meta-learning approach on 1) two standard public simulated benchmarks and 2) real-world placing experiments with a UR5 robot arm, significantly outperforming related state-of-the-art methods.
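To make the three-part procedure summarized in the abstract more concrete, below is a minimal, hypothetical PyTorch sketch (not the authors' released code): a hinge-style task-contrastive loss separates positive-demonstration embeddings from negative ones, a behavior-cloning loss predicts the actions of the positive demonstration, and both are wrapped in a first-order approximation of model-agnostic meta-learning. All names (DemoNet, task_contrastive_loss, fomaml_step), dimensions, and the first-order simplification are assumptions for illustration only.

# Hypothetical sketch: task-contrastive loss + behavior cloning inside a
# first-order MAML loop (FOMAML). Not the authors' implementation.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F


class DemoNet(nn.Module):
    """Toy network: maps a flattened demo frame to an embedding and a predicted action."""

    def __init__(self, obs_dim=64, emb_dim=32, act_dim=4):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(obs_dim, 64), nn.ReLU(), nn.Linear(64, emb_dim)
        )
        self.policy = nn.Linear(emb_dim, act_dim)

    def forward(self, obs):
        emb = self.encoder(obs)
        return emb, self.policy(emb)


def task_contrastive_loss(pos_emb, neg_emb, margin=1.0):
    """Hinge loss pushing positive-demo embeddings away from negative-demo embeddings."""
    return F.relu(margin - F.pairwise_distance(pos_emb, neg_emb)).mean()


def combined_loss(net, pos_obs, neg_obs, pos_act):
    """Contrast positive vs. negative demos and predict the positive demo's actions."""
    pos_emb, pred_act = net(pos_obs)
    neg_emb, _ = net(neg_obs)
    return task_contrastive_loss(pos_emb, neg_emb) + F.mse_loss(pred_act, pos_act)


def fomaml_step(meta_net, tasks, inner_lr=1e-2, meta_lr=1e-3, inner_steps=1):
    """One first-order MAML outer update over a batch of tasks (one scene each)."""
    meta_opt = torch.optim.SGD(meta_net.parameters(), lr=meta_lr)
    meta_opt.zero_grad()
    for support, query in tasks:
        learner = copy.deepcopy(meta_net)          # clone the meta-parameters
        inner_opt = torch.optim.SGD(learner.parameters(), lr=inner_lr)
        for _ in range(inner_steps):               # adapt on the support demonstration
            inner_opt.zero_grad()
            combined_loss(learner, *support).backward()
            inner_opt.step()
        combined_loss(learner, *query).backward()  # evaluate the adapted learner
        for mp, lp in zip(meta_net.parameters(), learner.parameters()):
            g = lp.grad.detach()                   # first-order: reuse adapted-parameter grads
            mp.grad = g.clone() if mp.grad is None else mp.grad + g
    meta_opt.step()


if __name__ == "__main__":
    # Random tensors stand in for real visual demonstrations: (pos_obs, neg_obs, pos_actions).
    net = DemoNet()
    fake_demo = lambda: (torch.randn(8, 64), torch.randn(8, 64), torch.randn(8, 4))
    tasks = [(fake_demo(), fake_demo()) for _ in range(4)]
    fomaml_step(net, tasks)

The sketch only shows how a contrastive term over positive/negative demonstrations can be folded into a MAML-style adaptation loop; the paper's actual architecture, losses, and replay mechanism differ.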
Pages: 8756-8767
Page count: 12