FCNN: Fourier Convolutional Neural Networks

Cited by: 65
Authors
Pratt, Harry [1 ]
Williams, Bryan [1 ]
Coenen, Frans [1 ]
Zheng, Yalin [1 ]
Affiliations
[1] Univ Liverpool, Liverpool L69 3BX, Merseyside, England
DOI
10.1007/978-3-319-71249-9_47
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The Fourier domain is used in computer vision and machine learning because image analysis tasks in the Fourier domain are analogous to spatial domain methods but are achieved using different operations. Convolutional Neural Networks (CNNs) use machine learning to achieve state-of-the-art results on many computer vision tasks. One of the main limiting aspects of CNNs is the computational cost of updating a large number of convolution parameters. Further, in the spatial domain, larger images take substantially longer to train on CNNs than smaller images due to the operations involved in convolution methods. Consequently, CNNs are often not a viable solution for large-image computer vision tasks. In this paper a Fourier Convolutional Neural Network (FCNN) is proposed whereby training is conducted entirely within the Fourier domain. The advantage offered is a significant speed-up in training time without loss of effectiveness. Using the proposed approach, larger images can therefore be processed within viable computation time. The FCNN is fully described and evaluated. The evaluation was conducted using the benchmark Cifar10 and MNIST datasets, and a bespoke fundus retina image dataset. The results demonstrate that convolution in the Fourier domain gives a significant speed-up without adversely affecting accuracy. For simplicity the proposed FCNN concept is presented in the context of a basic CNN architecture; however, the FCNN concept has the potential to improve the speed of any neural network system involving convolution.
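The speed-up the abstract describes rests on the convolution theorem: convolution in the spatial domain is element-wise multiplication in the Fourier domain, turning an O(N²K²) operation into O(N² log N) independent of kernel size. The following is a minimal NumPy sketch of that equivalence (circular convolution), not the authors' FCNN implementation; all function names here are illustrative.

```python
import numpy as np

def spatial_circular_conv2d(image, kernel):
    """Direct circular 2D convolution: O(N^2 * K^2) per image."""
    h, w = image.shape
    kh, kw = kernel.shape
    out = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            s = 0.0
            for u in range(kh):
                for v in range(kw):
                    # Wrap indices to make the convolution circular.
                    s += kernel[u, v] * image[(i - u) % h, (j - v) % w]
            out[i, j] = s
    return out

def fourier_conv2d(image, kernel):
    """Fourier-domain convolution: pad the kernel to image size,
    multiply spectra element-wise, and invert. O(N^2 log N),
    independent of kernel size."""
    h, w = image.shape
    padded = np.zeros((h, w))
    padded[:kernel.shape[0], :kernel.shape[1]] = kernel
    return np.real(np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(padded)))

rng = np.random.default_rng(0)
img = rng.standard_normal((16, 16))
ker = rng.standard_normal((3, 3))
# The two routes agree up to floating-point error.
assert np.allclose(spatial_circular_conv2d(img, ker), fourier_conv2d(img, ker))
```

Because the kernel's spectrum has the same shape as the image's, a Fourier-domain network can also treat the padded spectrum itself as the trainable parameter, which is the idea the FCNN builds on.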
Pages: 786 - 798
Page count: 13
Related Papers
50 records in total
  • [1] A Fourier domain acceleration framework for convolutional neural networks
    Lin, Jinhua
    Ma, Lin
    Yao, Yu
    [J]. NEUROCOMPUTING, 2019, 364 : 254 - 268
  • [2] FCNN: Simple neural networks for complex code tasks
    Sun, Xuekai
    Liu, Tieming
    Liu, Chunling
    Dong, Weiyu
    [J]. JOURNAL OF KING SAUD UNIVERSITY-COMPUTER AND INFORMATION SCIENCES, 2024, 36 (02)
  • [3] Segmentation of Histopathological Images with Convolutional Neural Networks using Fourier Features
    Hatipoglu, Nuh
    Bilgin, Gokhan
    [J]. 2015 23RD SIGNAL PROCESSING AND COMMUNICATIONS APPLICATIONS CONFERENCE (SIU), 2015, : 455 - 458
  • [4] Intelligent Location of Microseismic Events Based on a Fully Convolutional Neural Network (FCNN)
    Ma, Ke
    Sun, Xingye
    Zhang, Zhenghu
    Hu, Jing
    Wang, Zuorong
    [J]. ROCK MECHANICS AND ROCK ENGINEERING, 2022, 55 (08) : 4801 - 4817
  • [6] Image Reconstruction for Accelerated MR Scan With Faster Fourier Convolutional Neural Networks
    Liu, Xiaohan
    Pang, Yanwei
    Sun, Xuebin
    Liu, Yiming
    Hou, Yonghong
    Wang, Zhenchang
    Li, Xuelong
    [J]. IEEE TRANSACTIONS ON IMAGE PROCESSING, 2024, 33 : 2966 - 2978
  • [7] Convolutional neural networks
    Derry, Alexander
    Krzywinski, Martin
    Altman, Naomi
    [J]. NATURE METHODS, 2023, 20 (09) : 1269 - 1270
  • [9] The FrFT convolutional face: toward robust face recognition using the fractional Fourier transform and convolutional neural networks
    Wu, Xin
    Tao, Ran
    Hong, Danfeng
    Wang, Yue
    [J]. SCIENCE CHINA INFORMATION SCIENCES, 2020, 63 (01) : 235 - 237