A Broad Study on the Transferability of Visual Representations with Contrastive Learning

Cited by: 26
Authors
Islam, Ashraful [1]
Chen, Chun-Fu [2,3]
Panda, Rameswar [2,3]
Karlinsky, Leonid [3]
Radke, Richard [1]
Feris, Rogerio [2,3]
Affiliations
[1] Rensselaer Polytech Inst, Troy, NY 12181 USA
[2] MIT-IBM Watson AI Lab, Cambridge, MA USA
[3] IBM Res, Armonk, NY USA
DOI
10.1109/ICCV48922.2021.00872
Chinese Library Classification (CLC)
TP18 [Theory of artificial intelligence];
Subject classification codes
081104; 0812; 0835; 1405;
Abstract
Tremendous progress has been made in visual representation learning, notably with the recent success of self-supervised contrastive learning methods. Supervised contrastive learning has also been shown to outperform its cross-entropy counterpart by leveraging labels to choose where to contrast. However, there has been little work exploring how well contrastively learned representations transfer to different domains. In this paper, we conduct a comprehensive study on the transferability of learned representations of different contrastive approaches for linear evaluation, full-network transfer, and few-shot recognition on 12 downstream datasets from different domains, and object detection tasks on MSCOCO and VOC0712. The results show that the contrastive approaches learn representations that are easily transferable to a different downstream task. We further observe that the joint objective of self-supervised contrastive loss with cross-entropy/supervised-contrastive loss leads to better transferability of these models over their supervised counterparts. Our analysis reveals that the representations learned from the contrastive approaches contain more low/mid-level semantics than cross-entropy models, which enables them to quickly adapt to a new task. Our code and models will be publicly available to facilitate future research on the transferability of visual representations.
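The "joint objective" mentioned in the abstract can be read as adding a self-supervised contrastive term to the usual supervised loss. The sketch below is a rough illustration only, not the authors' released code: it combines a SimCLR-style NT-Xent contrastive loss over two augmented views with standard cross-entropy. The function names, the weighting factor alpha, and the temperature value are illustrative assumptions.

```python
# Illustrative sketch of a joint supervised + self-supervised contrastive
# objective (NOT the authors' implementation; names and hyperparameters
# are assumptions for demonstration).
import torch
import torch.nn.functional as F


def nt_xent_loss(z1, z2, temperature=0.1):
    """SimCLR-style NT-Xent loss over two augmented views of one batch.

    z1, z2: (N, D) projection-head outputs for the two views. Row i of one
    view is the positive for row i of the other; the remaining 2N - 2
    embeddings in the batch serve as negatives.
    """
    n = z1.size(0)
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)   # (2N, D)
    sim = torch.matmul(z, z.t()) / temperature            # (2N, 2N)
    sim.fill_diagonal_(float("-inf"))                      # drop self-similarity
    # Row i's positive sits at index (i + N) mod 2N.
    targets = (torch.arange(2 * n, device=z.device) + n) % (2 * n)
    return F.cross_entropy(sim, targets)


def joint_loss(logits, labels, z1, z2, alpha=1.0, temperature=0.1):
    """Cross-entropy on class logits plus a weighted contrastive term."""
    ce = F.cross_entropy(logits, labels)
    ssl = nt_xent_loss(z1, z2, temperature)
    return ce + alpha * ssl


if __name__ == "__main__":
    # Toy usage: 8 samples, 10 classes, 128-d projections of two views.
    logits = torch.randn(8, 10)
    labels = torch.randint(0, 10, (8,))
    z1, z2 = torch.randn(8, 128), torch.randn(8, 128)
    print(joint_loss(logits, labels, z1, z2).item())
```

A supervised-contrastive variant of this objective would replace the cross-entropy term (or the NT-Xent targets) with a label-aware positive set, but the additive structure of the combined loss is the same.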
Pages: 8825-8835
Number of pages: 11