Factorized Dynamic Fully-Connected Layers for Neural Networks

Cited by: 1
Authors:
Babiloni, Francesca [1 ,2 ]
Tanay, Thomas [1 ]
Deng, Jiankang [1 ,2 ]
Maggioni, Matteo [1 ]
Zafeiriou, Stefanos [2 ]
Affiliations:
[1] Huawei Noah's Ark Lab, Shenzhen, People's Republic of China
[2] Imperial College London, London, England
DOI: 10.1109/ICCVW60793.2023.00148
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Subject Classification Codes: 081104; 0812; 0835; 1405
Abstract:
The design of neural network layers plays a crucial role in determining the efficiency and performance of computer vision models. However, most existing layers trade fast feature extraction against reasoning ability, resulting in suboptimal outcomes. In this paper, we propose a novel and efficient operator for representation learning that dynamically adjusts to the underlying data structure. We introduce the general Dynamic Fully-Connected (DFC) layer, a non-linear extension of a Fully-Connected layer that has a learnable receptive field, is instance-adaptive, and is spatially aware. We use CP decomposition to reduce the complexity of the DFC layer without compromising its expressivity, and we then leverage Summed Area Tables and modulation to create an adaptive receptive field that processes the input at constant computational cost. We evaluate the effectiveness of our method on image classification and other downstream vision tasks using both hierarchical and isotropic architectures. Our results demonstrate that the proposed layer outperforms other commonly used layers by a significant margin at a fixed computational budget, thereby establishing a new strategy for efficiently designing neural architectures that capture the multi-scale features of the input without increasing complexity.
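To make the mechanics of the abstract concrete, below is a minimal PyTorch sketch, not the authors' implementation: `box_filter_sat`, `FactorizedDynamicFC`, and the `rank`/`radius` parameters are hypothetical names chosen here for illustration. It shows (a) how a summed-area table (integral image) yields windowed averages whose per-pixel cost is independent of the window radius, and (b) how a pair of low-rank factors plus an instance-adaptive gate can stand in for a dense dynamic weight tensor, in the spirit of the CP factorization the abstract describes.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def box_filter_sat(x, radius):
    """Mean over a (2r+1)x(2r+1) window via a summed-area table.

    The SAT is built once with two cumulative sums; each window sum then
    reads four corners, so per-position cost is constant in the radius.
    x: (B, C, H, W) -> (B, C, H, W)
    """
    B, C, H, W = x.shape
    # Integral image with a leading zero row/column so corner math is uniform.
    sat = F.pad(x, (1, 0, 1, 0)).cumsum(dim=2).cumsum(dim=3)  # (B, C, H+1, W+1)
    ys = torch.arange(H, device=x.device)
    xs = torch.arange(W, device=x.device)
    y0 = (ys - radius).clamp(min=0)
    y1 = (ys + radius + 1).clamp(max=H)
    x0 = (xs - radius).clamp(min=0)
    x1 = (xs + radius + 1).clamp(max=W)
    # Window sum = SAT[y1, x1] - SAT[y0, x1] - SAT[y1, x0] + SAT[y0, x0].
    sums = (sat[:, :, y1[:, None], x1[None, :]]
            - sat[:, :, y0[:, None], x1[None, :]]
            - sat[:, :, y1[:, None], x0[None, :]]
            + sat[:, :, y0[:, None], x0[None, :]])
    area = (y1 - y0)[:, None] * (x1 - x0)[None, :]  # true window size at borders
    return sums / area


class FactorizedDynamicFC(nn.Module):
    """Hypothetical low-rank (CP-style) dynamic channel mixing with an
    instance-adaptive gate; a sketch of the idea, not the paper's layer."""

    def __init__(self, channels, rank=8, radius=3):
        super().__init__()
        self.radius = radius
        self.u = nn.Linear(channels, rank, bias=False)  # input-side factor
        self.v = nn.Linear(rank, channels, bias=False)  # output-side factor
        self.gate = nn.Linear(channels, channels)       # per-instance modulation

    def forward(self, x):                                 # x: (B, C, H, W)
        ctx = box_filter_sat(x, self.radius)              # constant-cost spatial context
        g = torch.sigmoid(self.gate(x.mean(dim=(2, 3))))  # (B, C) instance gate
        z = self.v(self.u(ctx.permute(0, 2, 3, 1)))       # rank-R channel mixing
        return z.permute(0, 3, 1, 2) * g[:, :, None, None]


# Usage: FactorizedDynamicFC(64)(torch.randn(2, 64, 32, 32)) -> (2, 64, 32, 32)
```

Because the table is built once with two cumulative sums, enlarging `radius` leaves the per-position cost unchanged, which is the constant-complexity property the abstract attributes to Summed Area Tables.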
Pages: 1366-1375 (10 pages)
Related papers (50 in total):
  • [1] On the Learnability of Fully-connected Neural Networks
    Zhang, Yuchen
    Lee, Jason D.
    Wainwright, Martin J.
    Jordan, Michael I.
Artificial Intelligence and Statistics, 2017, 54: 83-91
  • [2] Equivalence of Approximation by Convolutional Neural Networks and Fully-Connected Networks
    Petersen, Philipp
    Voigtlaender, Felix
Proceedings of the American Mathematical Society, 2020, 148(4): 1567-1581
  • [3] On energy complexity of fully-connected layers
    Sima, Jiri
    Cabessa, Jeremie
    Vidnerova, Petra
Neural Networks, 2024, 178
  • [4] Energy Complexity of Fully-Connected Layers
    Sima, Jiri
    Cabessa, Jeremie
Advances in Computational Intelligence (IWANN 2023), Part I, 2023, 14134: 3-15
  • [5] Generalization in fully-connected neural networks for time series forecasting
    Borovykh, Anastasia
    Oosterlee, Cornelis W.
    Bohte, Sander M.
Journal of Computational Science, 2019, 36
  • [6] Fully-connected networks with local connections
    Kornilovitch, P. E.
    Bicknell, R. N.
    Yeo, J. S.
    Applied Physics A: Materials Science & Processing, 2009, 95(4): 999-1004
  • [7] Open loop stability criterion for layered and fully-connected neural networks
    Snyder, M.M.
    Ferry, D.K.
    Neural Networks, 1988, 1(1 Suppl)
  • [8] A Design Flow Framework for Fully-Connected Neural Networks Rapid Prototyping
    Zompakis, Nikolaos
    Anagnostos, Dimitrios
    Koliogeorgi, Konstantina
    Zervakis, Georgios
    Siozios, Kostas
    International Conference on Omni-Layer Intelligent Systems (COINS), 2019: 44-49
  • [9] Storage-Efficient Batching for Minimizing Bandwidth of Fully-Connected Neural Network Layers
    Shen, Yongming
    Ferdman, Michael
    Milder, Peter
    FPGA'17: Proceedings of the 2017 ACM/SIGDA International Symposium on Field-Programmable Gate Arrays, 2017: 293