Gram regularization for sparse and disentangled representation

Cited by: 0
Authors
Zhentao Gao
Yuanyuan Chen
Quan Guo
Zhang Yi
Affiliation
[1] Machine Intelligence Laboratory, College of Computer Science, Sichuan University
Source
Pattern Analysis and Applications, 2022, 25(2): 337-349
Keywords
Regularization; Sparse representation; Disentangled representation; Decision margin
DOI
Not available
Abstract
The relationship between samples is often ignored when training neural networks for classification tasks. If properly utilized, such information can bring many benefits to the trained models. On the one hand, neural networks trained without regard to the similarities between samples may represent different samples closely in the feature space even when they belong to different classes, which undermines the discriminative ability of the trained models. On the other hand, regularizing inter-class and intra-class similarities in the feature space during training can effectively disentangle the representations of different classes and make the representation sparse. To achieve this, a new regularization method is proposed that penalizes positive inter-class similarities and negative intra-class similarities in the feature space. Experimental results show that the proposed method not only obtains sparse and disentangled representations but also improves the performance of the trained models on many datasets.
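A minimal sketch of the kind of penalty the abstract describes, assuming PyTorch, mini-batch feature matrices, and cosine similarity as the Gram-matrix entries; the function name `gram_regularizer`, the normalization step, and the averaging are illustrative assumptions, not the authors' exact formulation:

```python
import torch
import torch.nn.functional as F

def gram_regularizer(features: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
    """Penalize positive inter-class and negative intra-class similarities.

    features: (N, D) mini-batch of feature vectors; labels: (N,) integer class ids.
    """
    z = F.normalize(features, dim=1)                    # unit-length feature vectors
    gram = z @ z.t()                                    # (N, N) Gram matrix of cosine similarities
    same = labels.unsqueeze(0) == labels.unsqueeze(1)   # True where a pair shares a class
    inter = torch.clamp(gram[~same], min=0.0).sum()     # positive similarity across classes
    intra = torch.clamp(-gram[same], min=0.0).sum()     # negative similarity within a class
    n = features.size(0)
    return (inter + intra) / (n * n)

# Usage sketch: add the penalty to the task loss with some weight lambda_reg, e.g.
# loss = F.cross_entropy(logits, labels) + lambda_reg * gram_regularizer(feats, labels)
```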
Pages: 337-349
Page count: 12
Related papers
50 records in total
  • [1] Gram regularization for sparse and disentangled representation
    Gao, Zhentao
    Chen, Yuanyuan
    Guo, Quan
    Yi, Zhang
    PATTERN ANALYSIS AND APPLICATIONS, 2022, 25 (02) : 337 - 349
  • [2] Encouraging Disentangled and Convex Representation with Controllable Interpolation Regularization
    Ge, Yunhao
    Xu, Zhi
    Xiao, Yao
    Xin, Gan
    Pang, Yunkui
    Itti, Laurent
    2023 IEEE/CVF WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV), 2023, : 4750 - 4758
  • [3] An adaptive regularization method for sparse representation
    Xu, Bingxin
    Guo, Ping
    Chen, C. L. Philip
    INTEGRATED COMPUTER-AIDED ENGINEERING, 2014, 21 (01) : 91 - 100
  • [4] Leveraging sparse and shared feature activations for disentangled representation learning
    Fumero, Marco
    Wenzel, Florian
    Zancato, Luca
    Achille, Alessandro
    Rodola, Emanuele
    Soatto, Stefano
    Scholkopf, Bernhard
    Locatello, Francesco
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [5] Sparse Representation with Regularization Term for Face Recognition
    Ji, Jian
    Ji, Huafeng
    Bai, Mengqi
    COMPUTER VISION, CCCV 2015, PT II, 2015, 547 : 10 - 20
  • [6] Spectral regularization and sparse representation bases for interferometric imaging
    Vannier, M.
    Mary, D.
    Millour, F.
    Petrov, R. G.
    Bourguignon, S.
    Theys, C.
    OPTICAL AND INFRARED INTERFEROMETRY II, 2010, 7734
  • [7] Image superresolution by midfrequency sparse representation and total variation regularization
    Xu, Jian
    Chang, Zhiguo
    Fan, Jiulun
    Zhao, Xiaoqiang
    Wu, Xiaomin
    Wang, Yanzi
    JOURNAL OF ELECTRONIC IMAGING, 2015, 24 (01)
  • [8] A recursive least squares algorithm with ℓ1 regularization for sparse representation
    Liu, Di
    Baldi, Simone
    Liu, Quan
    Yu, Wenwu
    SCIENCE CHINA INFORMATION SCIENCES, 2023, 66
  • [9] SR reconstruction algorithm of regularization based on improve of sparse representation
    Xie B.
    Wan S.
    Yin Y.
    Hongwai yu Jiguang Gongcheng/Infrared and Laser Engineering, 2022, 51 (03):
  • [10] Speech Reconstruction via Sparse Representation using Harmonic Regularization
    Tang, Yibin
    Chen, Ying
    Xu, Ning
    Zhu, Changping
    Zhou, Lin
    2015 INTERNATIONAL CONFERENCE ON WIRELESS COMMUNICATIONS & SIGNAL PROCESSING (WCSP), 2015,