GenURL: A General Framework for Unsupervised Representation Learning

Cited: 0
Authors
Li, Siyuan [1 ]
Liu, Zicheng [1 ]
Zang, Zelin [2 ]
Wu, Di [2 ]
Chen, Zhiyuan [2 ]
Li, Stan Z. [2 ]
Affiliations
[1] Zhejiang Univ, Hangzhou 310000, Peoples R China
[2] Westlake Univ, Sch Engn, AI Div, Hangzhou 310030, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Task analysis; Uniform resource locators; Germanium; Data models; Manifolds; Data structures; Representation learning; Contrastive learning (CL); dimension reduction (DR); graph embedding (GE); knowledge distillation (KD); self-supervised learning; NONLINEAR DIMENSIONALITY REDUCTION;
DOI
10.1109/TNNLS.2023.3332087
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Unsupervised representation learning (URL), which learns compact embeddings of high-dimensional data without supervision, has achieved remarkable progress recently. However, URL methods for different requirements have been developed independently, which limits the generalization of the algorithms and becomes especially prohibitive as the number of tasks grows. For example, dimension reduction (DR) methods such as t-SNE and UMAP optimize pairwise data relationships by preserving the global geometric structure, while self-supervised learning methods such as SimCLR and BYOL focus on mining the local statistics of instances under specific augmentations. To address this dilemma, we summarize and propose a unified similarity-based URL framework, GenURL, which can adapt smoothly to various URL tasks. In this article, we regard URL tasks as different implicit constraints on the data geometric structure that help to seek optimal low-dimensional representations, which boils down to data structural modeling (DSM) and low-dimensional transformation (LDT). Specifically, DSM provides a structure-based submodule to describe the global structures, and LDT learns compact low-dimensional embeddings with given pretext tasks. Moreover, an objective function, general Kullback-Leibler (GKL) divergence, is proposed to connect DSM and LDT naturally. Comprehensive experiments demonstrate that GenURL achieves consistent state-of-the-art performance in self-supervised visual learning, unsupervised knowledge distillation (KD), graph embeddings (GEs), and DR.
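The abstract describes matching a high-dimensional data structure (DSM) against a low-dimensional embedding (LDT) through a divergence between pairwise similarities. The sketch below illustrates that general idea only: the Gaussian similarity choice, the `alpha`-weighted symmetric KL form, and all function names are assumptions for illustration, not the paper's exact GKL formulation.

```python
import numpy as np

def pairwise_similarity(x, sigma=1.0):
    """Gaussian similarity matrix over the rows of x, normalised to a
    joint distribution (one common way to model pairwise structure)."""
    sq = np.sum(x ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * x @ x.T  # squared distances
    s = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(s, 0.0)  # ignore self-similarity
    return s / s.sum()

def gkl_divergence(p, q, alpha=1.0, eps=1e-12):
    """Illustrative 'generalized' KL: forward KL plus a reverse term
    weighted by alpha. With alpha=0 it reduces to standard KL."""
    p, q = p + eps, q + eps
    forward = np.sum(p * np.log(p / q))
    reverse = np.sum(q * np.log(q / p))
    return forward + alpha * reverse

# Toy usage: compare the structure of random high-dimensional data
# (the DSM side) with a random 2-D embedding (the LDT side).
rng = np.random.default_rng(0)
x_high = rng.normal(size=(32, 64))
z_low = rng.normal(size=(32, 2))
loss = gkl_divergence(pairwise_similarity(x_high),
                      pairwise_similarity(z_low))
```

In an actual URL pipeline, `loss` would be minimized with respect to the embedding parameters; here both sides are random, so the value is just a structural mismatch score.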
Pages: 286-298 (13 pages)
Related Papers (50 total)
  • [41] Unsupervised representation learning with Minimax distance measures
    Haghir Chehreghani, Morteza
    MACHINE LEARNING, 2020, 109 (11) : 2063 - 2097
  • [42] Online Deep Clustering for Unsupervised Representation Learning
    Zhan, Xiaohang
    Xie, Jiahao
    Liu, Ziwei
    Ong, Yew-Soon
    Loy, Chen Change
    2020 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2020, : 6687 - 6696
  • [43] Disentangled Representation Learning for Unsupervised Neural Quantization
    Noh, Haechan
    Hyun, Sangeek
    Jeong, Woojin
    Lim, Hanshin
    Heo, Jae-Pil
    2023 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2023, : 12001 - 12010
  • [44] An Unsupervised Autoregressive Model for Speech Representation Learning
    Chung, Yu-An
    Hsu, Wei-Ning
    Tang, Hao
    Glass, James
    INTERSPEECH 2019, 2019, : 146 - 150
  • [45] Jigsaw Clustering for Unsupervised Visual Representation Learning
    Chen, Pengguang
    Liu, Shu
    Jia, Jiaya
    2021 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION, CVPR 2021, 2021, : 11521 - 11530
  • [46] Unsupervised Feature Recommendation using Representation Learning
    Datta, Anish
    Bandyopadhyay, Soma
    Sachan, Shruti
    Pal, Arpan
    2022 30TH EUROPEAN SIGNAL PROCESSING CONFERENCE (EUSIPCO 2022), 2022, : 1591 - 1595
  • [48] Unsupervised Representation Learning on Attributed Multiplex Network
    Zhang, Rui
    Zimek, Arthur
    Schneider-Kamp, Peter
    PROCEEDINGS OF THE 31ST ACM INTERNATIONAL CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, CIKM 2022, 2022, : 2610 - 2619
  • [49] Adversarial Fisher Vectors for Unsupervised Representation Learning
    Zhai, Shuangfei
    Talbott, Walter
    Guestrin, Carlos
    Susskind, Joshua M.
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [50] A Theoretical Analysis of Contrastive Unsupervised Representation Learning
    Arora, Sanjeev
    Khandeparkar, Hrishikesh
    Khodak, Mikhail
    Plevrakis, Orestis
    Saunshi, Nikunj
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019, 97