State Primitive Learning to Overcome Catastrophic Forgetting in Robotics

Cited by: 0
Authors
Fangzhou Xiong
Zhiyong Liu
Kaizhu Huang
Xu Yang
Hong Qiao
Affiliations
[1] State Key Lab of Management and Control for Complex Systems, Institute of Automation, Chinese Academy of Sciences
[2] School of Artificial Intelligence, University of Chinese Academy of Sciences (UCAS)
[3] CAS Centre for Excellence in Brain Science and Intelligence Technology (CEBSIT)
[4] Department of EEE, Xi’an Jiaotong-Liverpool University
Source
Cognitive Computation | 2021, Vol. 13
Keywords
Catastrophic forgetting; State primitives; Robotics; Continual learning;
DOI
Not available
Abstract
People can continuously learn a wide range of tasks without catastrophic forgetting. To mimic this continual learning capability, current methods mainly focus on one-step supervised learning problems, e.g., image classification: they aim to retain performance on previously learned images while neural networks are sequentially trained on new ones. In this paper, we concentrate on solving multi-step robotic tasks sequentially with a proposed architecture called state primitive learning. By projecting the original state space into a low-dimensional representation, meaningful state primitives can be generated to describe tasks. Under two different kinds of constraints on the generation of state primitives, control signals corresponding to different robotic tasks can be addressed separately with only an efficient linear regression. Experiments on several robotic manipulation tasks demonstrate the efficacy of the new method in learning control signals under the continual learning scenario, delivering substantially improved performance over comparison methods.
Pages: 394-402
Page count: 8
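The abstract outlines the pipeline at a high level: project the raw state space into a low-dimensional set of state primitives, then fit control signals for each task with an efficient linear regression. The paper's actual primitive-generation procedure and its two constraints are not given here, so the following Python sketch is illustrative only: it substitutes PCA for the projection and scikit-learn's LinearRegression for the control fit, and the class name StatePrimitiveRegressor, the parameter n_primitives, and the toy data are hypothetical, not the authors' implementation.

```python
# Illustrative sketch only (assumptions: PCA as the low-dimensional projection,
# scikit-learn LinearRegression as the per-task control fit).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression


class StatePrimitiveRegressor:
    """Per-task linear controllers on top of a shared low-dimensional state projection."""

    def __init__(self, n_primitives=8):
        self.projector = PCA(n_components=n_primitives)  # stand-in for primitive generation
        self.task_heads = {}                             # task_id -> fitted linear head
        self._projector_fitted = False

    def fit_task(self, task_id, states, controls):
        # Fit the shared projection once (on the first task) and reuse it afterwards,
        # so training later tasks cannot disturb the primitive representation.
        if not self._projector_fitted:
            self.projector.fit(states)
            self._projector_fitted = True
        primitives = self.projector.transform(states)
        head = LinearRegression().fit(primitives, controls)  # efficient linear fit per task
        self.task_heads[task_id] = head                      # separate head: no overwriting

    def predict(self, task_id, states):
        primitives = self.projector.transform(states)
        return self.task_heads[task_id].predict(primitives)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    states = rng.normal(size=(200, 20))    # hypothetical 20-D robot states
    controls = rng.normal(size=(200, 7))   # hypothetical 7-DoF control signals
    model = StatePrimitiveRegressor(n_primitives=8)
    model.fit_task("reach", states, controls)
    print(model.predict("reach", states[:5]).shape)  # -> (5, 7)
```

The design choice mirrored here is that each task gets its own linear head on top of a fixed shared representation, so learning a new task cannot overwrite the parameters used by an earlier one; how the paper actually constrains primitive generation to achieve this is beyond what the abstract states.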
Related papers
50 in total
  • [1] State Primitive Learning to Overcome Catastrophic Forgetting in Robotics
    Xiong, Fangzhou
    Liu, Zhiyong
    Huang, Kaizhu
    Yang, Xu
    Qiao, Hong
    [J]. COGNITIVE COMPUTATION, 2021, 13 (02) : 394 - 402
  • [2] Unsupervised Learning to Overcome Catastrophic Forgetting in Neural Networks
    Munoz-Martin, Irene
    Bianchi, Stefano
    Pedretti, Giacomo
    Melnic, Octavian
    Ambrogio, Stefano
    Ielmini, Daniele
    [J]. IEEE JOURNAL ON EXPLORATORY SOLID-STATE COMPUTATIONAL DEVICES AND CIRCUITS, 2019, 5 (01): : 58 - 66
  • [3] Generalisable deep learning framework to overcome catastrophic forgetting
    Alammar, Zaenab
    Alzubaidi, Laith
    Zhang, Jinglan
    Li, Yuefeng
    Gupta, Ashish
    Gu, Yuantong
    [J]. INTELLIGENT SYSTEMS WITH APPLICATIONS, 2024, 23
  • [4] Multi-objective Learning to Overcome Catastrophic Forgetting in Time-series Applications
    Mahmoud, Reem A.
    Hajj, Hazem
    [J]. ACM TRANSACTIONS ON KNOWLEDGE DISCOVERY FROM DATA, 2022, 16 (06)
  • [5] Encoding primitives generation policy learning for robotic arm to overcome catastrophic forgetting in sequential multi-tasks learning
    Xiong, Fangzhou
    Liu, Zhiyong
    Huang, Kaizhu
    Yang, Xu
    Qiao, Hong
    Hussain, Amir
    [J]. NEURAL NETWORKS, 2020, 129 : 163 - 173
  • [6] Vaccine Enhanced Continual Learning With TFE to Overcome Catastrophic Forgetting for Variable Speed-Bearing Fault Diagnosis
    Wang, Lu
    Liu, Shulin
    Xiao, Haihua
    [J]. IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2024, 20 (05) : 7112 - 7123
  • [7] Mitigating Catastrophic Forgetting with Complementary Layered Learning
    Mondesire, Sean
    Wiegand, R. Paul
    [J]. ELECTRONICS, 2023, 12 (03)
  • [8] Quantum Continual Learning Overcoming Catastrophic Forgetting
    Jiang, Wenjie
    Lu, Zhide
    Deng, Dong-Ling
    [J]. CHINESE PHYSICS LETTERS, 2022, 39 (05)
  • [9] Comparative Analysis of Catastrophic Forgetting in Metric Learning
    Huo, Jiahao
    van Zyl, Terence L.
    [J]. 2020 7TH INTERNATIONAL CONFERENCE ON SOFT COMPUTING & MACHINE INTELLIGENCE (ISCMI 2020), 2020, : 68 - 72