Deep Contrastive Representation Learning With Self-Distillation

Cited by: 64
Authors
Xiao, Zhiwen [1 ,2 ,3 ]
Xing, Huanlai [1 ,2 ,3 ]
Zhao, Bowen [1 ,2 ,3 ]
Qu, Rong [4 ]
Luo, Shouxi [1 ,2 ,3 ]
Dai, Penglin [1 ,2 ,3 ]
Li, Ke [1 ,2 ,3 ]
Zhu, Zonghai [1 ,2 ,3 ]
Affiliations
[1] Southwest Jiaotong Univ, Sch Comp & Artificial Intelligence, Chengdu 610031, Peoples R China
[2] Southwest Jiaotong Univ, Tangshan Inst, Tangshan 063000, Peoples R China
[3] Minist Educ, Engn Res Ctr Sustainable Urban Intelligent Transpo, Beijing, Peoples R China
[4] Univ Nottingham, Sch Comp Sci, Nottingham NG7 2RD, England
Funding
National Natural Science Foundation of China;
Keywords
Contrastive learning; knowledge distillation; representation learning; time series classification; time series clustering; SERIES CLASSIFICATION;
DOI
10.1109/TETCI.2023.3304948
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Recently, contrastive learning (CL) has emerged as a promising way of learning discriminative representations from time series data. In the representation hierarchy, the semantic information extracted at lower levels forms the basis of that captured at higher levels. Low-level semantic information is therefore essential and should be considered in the CL process. However, existing CL algorithms mainly focus on the similarity of high-level semantic information; considering the similarity of low-level semantic information as well may improve the performance of CL. To this end, we present deep contrastive representation learning with self-distillation (DCRLS) for the time series domain. DCRLS gracefully combines data augmentation, deep contrastive learning, and self-distillation. Our data augmentation generates different views of the same sample as the input of DCRLS. Unlike most CL algorithms, which concentrate on high-level semantic information only, our deep contrastive learning also considers the contrastive similarity of low-level semantic information between peer residual blocks. Our self-distillation promotes knowledge flow from high-level to low-level blocks to help regularize DCRLS during knowledge transfer. Experimental results demonstrate that DCRLS-based structures achieve excellent classification and clustering performance on 36 UCR2018 datasets.
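The mechanism outlined in the abstract (two augmented views of each sample, a contrastive term at every residual block rather than only the deepest one, and distillation of high-level knowledge back into the shallower blocks) can be illustrated with a short PyTorch-style sketch. Everything below — the block architecture, the per-block projection heads, the NT-Xent contrastive term, the MSE-based distillation term, and the loss weights alpha and beta — is an assumption made for readability, not the authors' DCRLS implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ResBlock1D(nn.Module):
    """A minimal 1-D residual block for time series (illustrative only)."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.conv1 = nn.Conv1d(in_ch, out_ch, kernel_size=3, padding=1)
        self.conv2 = nn.Conv1d(out_ch, out_ch, kernel_size=3, padding=1)
        self.skip = nn.Conv1d(in_ch, out_ch, kernel_size=1)
        self.bn1 = nn.BatchNorm1d(out_ch)
        self.bn2 = nn.BatchNorm1d(out_ch)

    def forward(self, x):
        h = F.relu(self.bn1(self.conv1(x)))
        h = self.bn2(self.conv2(h))
        return F.relu(h + self.skip(x))

def nt_xent(z1, z2, temperature=0.5):
    """SimCLR-style contrastive loss; rows of z1 and z2 are positive pairs."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    z = torch.cat([z1, z2], dim=0)                        # (2N, D)
    sim = z @ z.t() / temperature                         # cosine similarities
    n = z1.size(0)
    mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(mask, float('-inf'))            # drop self-similarity
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)

class DCRLSSketch(nn.Module):
    """Stacked residual blocks, each with its own projection head, so that
    every level contributes a contrastive term and the deepest level acts
    as the teacher for self-distillation (hypothetical configuration)."""
    def __init__(self, in_ch=1, dims=(32, 64, 128), proj_dim=64):
        super().__init__()
        chans = (in_ch,) + tuple(dims)
        self.blocks = nn.ModuleList(
            [ResBlock1D(chans[i], chans[i + 1]) for i in range(len(dims))])
        self.heads = nn.ModuleList([nn.Linear(d, proj_dim) for d in dims])

    def embed(self, x):
        feats, h = [], x
        for blk, head in zip(self.blocks, self.heads):
            h = blk(h)
            feats.append(head(h.mean(dim=-1)))             # global average pool
        return feats                                        # low -> high level

    def forward(self, view1, view2, alpha=1.0, beta=0.5):
        f1, f2 = self.embed(view1), self.embed(view2)
        # Contrast the two augmented views at every level, not just the last.
        contrastive = sum(nt_xent(a, b) for a, b in zip(f1, f2))
        # Self-distillation: pull shallow embeddings toward the detached
        # deepest embedding, so knowledge flows from high to low levels.
        teacher = f1[-1].detach()
        distill = sum(F.mse_loss(f, teacher) for f in f1[:-1])
        return alpha * contrastive + beta * distill

# Hypothetical usage: two augmented views of a batch of univariate series.
# model = DCRLSSketch()
# loss = model(torch.randn(8, 1, 128), torch.randn(8, 1, 128))
```

Detaching the deepest embedding when computing the distillation term is one common way to keep the knowledge flow one-directional, from high-level to low-level blocks, mirroring the regularization role described in the abstract.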
Pages: 3-15
Number of pages: 13
Related Papers
50 records in total
  • [1] Adaptive Similarity Bootstrapping for Self-Distillation based Representation Learning
    Lebailly, Tim
    Stegmueller, Thomas
    Bozorgtabar, Behzad
    Thiran, Jean-Philippe
    Tuytelaars, Tinne
    2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2023), 2023, : 16459 - 16468
  • [2] Global-Local Self-Distillation for Visual Representation Learning
    Lebailly, Tim
    Tuytelaars, Tinne
    2023 IEEE/CVF WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV), 2023, : 1441 - 1450
  • [3] Deep Neural Network Self-Distillation Exploiting Data Representation Invariance
    Xu, Ting-Bing
    Liu, Cheng-Lin
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022, 33 (01) : 257 - 269
  • [4] Robust Cross-Modal Representation Learning with Progressive Self-Distillation
    Andonian, Alex
    Chen, Shixing
    Hamid, Raffay
    2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR 2022), 2022, : 16409 - 16420
  • [5] Contrastive knowledge-augmented self-distillation approach for few-shot learning
    Zhang, Lixu
    Shao, Mingwen
    Chen, Sijie
    Liu, Fukang
    JOURNAL OF ELECTRONIC IMAGING, 2023, 32 (05)
  • [6] DinoSR: Self-Distillation and Online Clustering for Self-supervised Speech Representation Learning
    Liu, Alexander H.
    Chang, Heng-Jui
    Auli, Michael
    Hsu, Wei-Ning
    Glass, James
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [7] Simultaneous Similarity-based Self-Distillation for Deep Metric Learning
    Roth, Karsten
    Milbich, Timo
    Ommer, Bjorn
    Cohen, Joseph Paul
    Ghassemi, Marzyeh
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139
  • [8] Reverse Self-Distillation Overcoming the Self-Distillation Barrier
    Ni, Shuiping
    Ma, Xinliang
    Zhu, Mingfu
    Li, Xingwang
    Zhang, Yu-Dong
    IEEE OPEN JOURNAL OF THE COMPUTER SOCIETY, 2023, 4 : 195 - 205
  • [9] Multilingual Representation Distillation with Contrastive Learning
    Tan, Weiting
    Heffernan, Kevin
    Schwenk, Holger
    Koehn, Philipp
    17TH CONFERENCE OF THE EUROPEAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EACL 2023, 2023, : 1477 - 1490
  • [10] Are Your Comments Positive? A Self-Distillation Contrastive Learning Method for Analyzing Online Public Opinion
    Zhou, Dongyang
    Shi, Lida
    Wang, Bo
    Xu, Hao
    Huang, Wei
    ELECTRONICS, 2024, 13 (13)