DK-CNNs: Dynamic kernel convolutional neural networks

Cited by: 6
Authors
Liu, Jialin [1 ]
Chao, Fei [1 ,3 ]
Lin, Chih-Min [2 ]
Zhou, Changle [1 ]
Shang, Changjing [3 ]
Affiliations
[1] Xiamen Univ, Sch Informat, Dept Artificial Intelligence, Xiamen, Peoples R China
[2] Yuan Ze Univ, Dept Elect Engn, Taoyuan 320, Taiwan
[3] Aberystwyth Univ, Inst Math Phys & Comp Sci, Aberystwyth, Dyfed, Wales
Funding
National Natural Science Foundation of China; European Union Horizon 2020
Keywords
Deep neural networks; Convolutional neural networks; Convolution kernel; PCANET;
DOI
10.1016/j.neucom.2020.09.005
CLC Number
TP18 [Artificial intelligence theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
This paper introduces dynamic kernel convolutional neural networks (DK-CNNs), an enhanced type of CNN, by performing line-by-line scanning regular convolution to generate a latent dimension of kernel weights. The proposed DK-CNN applies regular convolution to the DK weights, which rely on a latent variable, and discretizes the space of the latent variable to extend a new dimension; this process is named "DK convolution". DK convolution increases the expressive capacity of the convolution operation without increasing the number of parameters by searching for useful patterns within the new extended dimension. In contrast to conventional convolution, which applies a fixed kernel to analyse the changed features, DK convolution employs a DK to analyse fixed features. In addition, DK convolution can replace a standard convolution layer in any CNN network structure. The proposed DK-CNNs were compared with different network structures with and without a latent dimension on the CIFAR and FashionMNIST data sets. The experimental results show that DK-CNNs can achieve better performance than regular CNNs. (c) 2020 Elsevier B.V. All rights reserved.
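To make the mechanism described in the abstract concrete, the following is a minimal PyTorch sketch, not the authors' implementation: the latent variable is discretized into a few steps, each step derives a kernel variant from the same base weights (here by a cyclic, line-by-line shift of the kernel, an assumption made purely for illustration), each variant is applied to the fixed input features by ordinary convolution, and the responses are reduced by a max over the new latent dimension, so no parameters are added beyond the base kernel. The class name `DKConv2dSketch`, the shift-based kernel generator, and the max reduction are hypothetical; the paper's exact DK construction differs in detail.

```python
# Illustrative sketch of a "dynamic kernel" (DK) convolution layer.
# NOT the authors' code: the latent variable is discretized into
# `latent_steps` values, each value yields a kernel variant from the
# same base weights (cyclic line-by-line shift, assumed here), and the
# per-variant responses are max-reduced over the new latent dimension.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DKConv2dSketch(nn.Module):
    """Hypothetical stand-in for one DK convolution layer."""

    def __init__(self, in_channels, out_channels, kernel_size=3, latent_steps=4):
        super().__init__()
        self.latent_steps = latent_steps
        # Base kernel, shared by every discretized latent value.
        self.weight = nn.Parameter(
            torch.randn(out_channels, in_channels, kernel_size, kernel_size) * 0.01
        )

    def forward(self, x):
        out_ch, in_ch, k, _ = self.weight.shape
        # View each kernel as a single "line" of weights so the latent
        # variable can act as a cyclic, line-by-line shift of that line.
        flat = self.weight.view(out_ch, in_ch, k * k)
        responses = []
        for step in range(self.latent_steps):
            # Kernel variant for this latent value; the input features
            # stay fixed, only the kernel changes.
            w_step = torch.roll(flat, shifts=step, dims=-1).view(out_ch, in_ch, k, k)
            responses.append(F.conv2d(x, w_step, padding=k // 2))
        # Stack along the extended (latent) dimension and search it for
        # the strongest response (max reduction).
        stacked = torch.stack(responses, dim=2)  # (B, out_ch, latent_steps, H, W)
        return stacked.max(dim=2).values
```

As a usage illustration, `DKConv2dSketch(3, 16)` applied to a batch of shape `(8, 3, 32, 32)` returns a tensor of shape `(8, 16, 32, 32)`, i.e. the layer behaves as a drop-in replacement for a standard 3x3 convolution layer, consistent with the abstract's claim that DK convolution can replace a standard convolution layer in any CNN structure.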
Pages: 95-108
Number of pages: 14