Unified Privileged Knowledge Distillation Framework for Human Motion Prediction

Cited: 0
Authors
Sun, Xiaoning [1 ]
Sun, Huaijiang [1 ]
Wei, Dong [1 ]
Wang, Jin [2 ]
Li, Bin [3 ]
Li, Weiqing [1 ]
Lu, Jianfeng [1 ]
Affiliations
[1] Nanjing Univ Sci & Technol, Sch Comp Sci & Engn, Nanjing 210014, Peoples R China
[2] Nantong Univ, Sch Informat Sci & Technol, Nantong 226000, Peoples R China
[3] AiForward Co Ltd, Tianjin 300457, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Predictive models; Training; Interpolation; Extrapolation; Task analysis; Knowledge engineering; Human motion prediction; privileged knowledge; knowledge distillation;
DOI
10.1109/TCSVT.2024.3440488
Chinese Library Classification (CLC)
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology];
Discipline Classification Codes
0808; 0809;
Abstract
Previous works on human motion prediction follow the pattern of building an extrapolation mapping from the observed sequence to the one to be predicted. However, the inherent difficulty of time-series extrapolation and the complexity of human motion data still result in many failure cases. In this paper, we explore a longer sequence horizon with additional poses following the prediction target, which removes the usual constraint of extrapolation problems that the data on the far side of the predictive target is completely unknown. As these poses are unavailable at test time, we regard them as a privileged sequence, and propose a Two-stage Privileged Knowledge Distillation framework that incorporates privileged information into the forecasting process while avoiding its direct use. Specifically, in the first stage, both the observed and the privileged sequence are encoded for interpolation, with the Privileged-sequence-Encoder (Priv-Encoder) learning privileged knowledge (PK) simultaneously. In the second stage, where the privileged sequence is no longer observable, a novel PK-Simulator distills PK by approximating the behavior of Priv-Encoder while taking only the observed sequence as input, enabling a PK-aware prediction pattern. Moreover, we present a One-stage version of this framework, using a Shared Encoder that integrates the observation encoding in both the interpolation and prediction branches to realize parallel training, which helps produce the PK most conducive to the prediction pipeline. Experimental results show that our frameworks are model-agnostic, and can be applied to existing motion prediction models with encoder-decoder architectures to achieve improved performance.
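The abstract describes the two training stages concretely enough to sketch in code. Below is a minimal PyTorch sketch of that structure, under stated assumptions: the GRU encoders, MLP decoder, MSE losses, the 66-dimensional pose vectors, and all hyperparameters are placeholders of ours, not the paper's; only the stage layout and the names Priv-Encoder and PK-Simulator come from the abstract.

```python
# Minimal sketch of Two-stage Privileged Knowledge Distillation as outlined in
# the abstract. Assumptions (not from the paper): GRU encoders, an MLP decoder,
# MSE losses, pose_dim=66, and all dimensions/weights below.
import torch
import torch.nn as nn

class SeqEncoder(nn.Module):
    """Encodes a pose sequence (B, T, D) into a fixed-size code (B, H)."""
    def __init__(self, pose_dim, hidden=256):
        super().__init__()
        self.gru = nn.GRU(pose_dim, hidden, batch_first=True)
    def forward(self, x):
        _, h = self.gru(x)           # h: (1, B, H)
        return h.squeeze(0)          # (B, H)

class SeqDecoder(nn.Module):
    """Decodes a code (B, C) into t_pred future poses (B, t_pred, D)."""
    def __init__(self, code_dim, pose_dim, t_pred, hidden=256):
        super().__init__()
        self.t_pred, self.pose_dim = t_pred, pose_dim
        self.mlp = nn.Sequential(
            nn.Linear(code_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, t_pred * pose_dim))
    def forward(self, code):
        return self.mlp(code).view(-1, self.t_pred, self.pose_dim)

# --- Stage 1: interpolation using the privileged sequence (training only) ---
# obs:  (B, T_obs, D) observed poses; priv: (B, T_priv, D) poses *after* the
# target window; tgt: (B, T_pred, D) target poses, interpolated between them.
obs_enc    = SeqEncoder(pose_dim=66)
priv_enc   = SeqEncoder(pose_dim=66)   # "Priv-Encoder": its output is the PK
interp_dec = SeqDecoder(code_dim=512, pose_dim=66, t_pred=25)

def stage1_loss(obs, priv, tgt):
    pk = priv_enc(priv)                          # privileged knowledge
    pred = interp_dec(torch.cat([obs_enc(obs), pk], dim=-1))
    return nn.functional.mse_loss(pred, tgt)

# --- Stage 2: distill PK into a simulator that sees only the observation ----
pk_sim   = SeqEncoder(pose_dim=66)               # "PK-Simulator"
pred_dec = SeqDecoder(code_dim=512, pose_dim=66, t_pred=25)

def stage2_loss(obs, priv, tgt, distill_w=1.0):
    with torch.no_grad():                        # Priv-Encoder frozen here
        pk_teacher = priv_enc(priv)
    pk_student = pk_sim(obs)                     # mimics Priv-Encoder from obs only
    pred = pred_dec(torch.cat([obs_enc(obs), pk_student], dim=-1))
    return (nn.functional.mse_loss(pred, tgt)
            + distill_w * nn.functional.mse_loss(pk_student, pk_teacher))

# Test time uses stage-2 modules only: pred_dec(cat(obs_enc(obs), pk_sim(obs))).
```

The point this sketch mirrors is that the privileged branch exists only at training time: inference runs entirely from the observed sequence, with PK-Simulator standing in for the frozen Priv-Encoder.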
Pages: 12937-12948
Page count: 12
Related Papers
50 items in total
  • [41] A Two-Teacher Framework for Knowledge Distillation
    Chen, Xingjian
    Su, Jianbo
    Zhang, Jun
    ADVANCES IN NEURAL NETWORKS - ISNN 2019, PT I, 2019, 11554 : 58 - 66
  • [42] Closed-loop unified knowledge distillation for dense object detection
    Song, Yaoye
    Zhang, Peng
    Huang, Wei
    Zha, Yufei
    You, Tao
    Zhang, Yanning
    PATTERN RECOGNITION, 2024, 149
  • [43] Revisiting Knowledge Distillation: An Inheritance and Exploration Framework
    Huang, Zhen
    Shen, Xu
    Xing, Jun
    Liu, Tongliang
    Tian, Xinmei
    Li, Houqiang
    Deng, Bing
    Huang, Jianqiang
    Hua, Xian-Sheng
    2021 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR 2021, 2021, : 3578 - 3587
  • [44] DS-SRD: a unified framework for structured representation distillation
    Xia, Yuelong
    Yang, Jing
    Sun, Xiaodi
    Zhang, Yungang
    MULTIMEDIA SYSTEMS, 2024, 30 (06)
  • [45] Image classification framework based on knowledge distillation
    Zhao, Hong-Wei
    Wu, Hong
    Ma, Ke
    Li, Hai
    Jilin Daxue Xuebao (Gongxueban)/Journal of Jilin University (Engineering and Technology Edition), 2024, 54 (08): : 2307 - 2312
  • [46] A Unified Framework for Real Time Motion Completion
    Duan, Yinglin
    Lin, Yue
    Zou, Zhengxia
    Yuan, Yi
    Qian, Zhehui
    Zhang, Bohan
    THIRTY-SIXTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FOURTH CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE / THE TWELFTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2022, : 4459 - 4467
  • [47] A unified drug-target interaction prediction framework based on knowledge graph and recommendation system
    Ye, Qing
    Hsieh, Chang-Yu
    Yang, Ziyi
    Kang, Yu
    Chen, Jiming
    Cao, Dongsheng
    He, Shibo
    Hou, Tingjun
    NATURE COMMUNICATIONS, 2021, 12 (01)
  • [48] Knowledge Distillation with a Precise Teacher and Prediction with Abstention
    Xu, Yi
    Pu, Jian
    Zhao, Hui
    2020 25TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2021, : 9000 - 9006
  • [49] Ensembled CTR Prediction via Knowledge Distillation
    Zhu, Jieming
    Liu, Jinyang
    Li, Weiqi
    Lai, Jincai
    He, Xiuqiang
    Chen, Liang
    Zheng, Zibin
    CIKM '20: PROCEEDINGS OF THE 29TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT, 2020, : 2941 - 2948
  • [50] Lightweight Spectrum Prediction Based on Knowledge Distillation
    Cheng, Runmeng
    Zhang, Jianzhao
    Deng, Junquan
    Zhu, Yanping
    RADIOENGINEERING, 2023, 32 (04) : 469 - 478