Training a V1 Like Layer Using Gabor Filters in Convolutional Neural Networks

Cited by: 0
Authors
Bai, Jun [1 ]
Zeng, Yi [1 ,2 ,3 ,4 ]
Zhao, Yuxuan [1 ]
Zhao, Feifei [1 ,2 ]
Affiliations
[1] Chinese Acad Sci, Inst Automat, Res Ctr Brain Inspired Intelligence, Beijing, Peoples R China
[2] Univ Chinese Acad Sci, Beijing, Peoples R China
[3] Chinese Acad Sci, Ctr Excellence Brain Sci & Intelligence Technol, Shanghai, Peoples R China
[4] Chinese Acad Sci, Inst Automat, Natl Lab Pattern Recognit, Beijing, Peoples R China
Keywords
RECEPTIVE-FIELDS
DOI
none
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Neuroscience suggests that the responses of V1 neurons in the primate visual stream behave much like Gabor filters. Taking inspiration from primate neural circuits, we propose replacing the first layer of convolutional neural networks with a bank of Gabor filters. To further enhance the network, we introduce a lateral inhibitory mechanism into the Gabor filters, also inspired by findings from neuroscience. To improve performance, we explore the parameter space and select the best-suited parameters using cross-validation. Experimental results demonstrate that the resulting accuracy essentially matches that of the original convolutional neural networks, while the adoption of Gabor filters greatly reduces training time as well as memory and storage cost.
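The idea of a fixed, V1-like first layer can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the kernel parameters (`sigma`, `lambd`, `gamma`, `psi`), the number of orientations, and the subtractive form of lateral inhibition are all assumptions for illustration; the paper selects its parameters by cross-validation and may use a different inhibition scheme.

```python
import numpy as np

def gabor_kernel(size, theta, sigma=2.0, lambd=4.0, gamma=0.5, psi=0.0):
    """Real-valued Gabor kernel: a Gaussian envelope times a cosine carrier.

    theta: orientation (rad), sigma: envelope width, lambd: carrier
    wavelength, gamma: spatial aspect ratio, psi: phase offset.
    """
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    # Rotate the coordinate frame by theta.
    x_t = x * np.cos(theta) + y * np.sin(theta)
    y_t = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(x_t**2 + gamma**2 * y_t**2) / (2.0 * sigma**2))
    carrier = np.cos(2.0 * np.pi * x_t / lambd + psi)
    return envelope * carrier

# A fixed "V1-like" filter bank: one kernel per orientation, evenly
# spanning [0, pi). These kernels would replace the learned weights of
# the first convolutional layer.
bank = np.stack([gabor_kernel(7, theta)
                 for theta in np.linspace(0.0, np.pi, 8, endpoint=False)])
print(bank.shape)  # → (8, 7, 7)

def lateral_inhibition(responses, alpha=0.1):
    """Toy subtractive lateral inhibition across orientation channels:
    each channel is suppressed by the mean of the other channels."""
    total = responses.sum(axis=0, keepdims=True)
    others_mean = (total - responses) / (responses.shape[0] - 1)
    return responses - alpha * others_mean
```

Because the filter bank is analytically defined rather than learned, only the orientations and a handful of shape parameters need to be chosen, which is what makes the training-time and storage savings reported in the abstract plausible.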
Pages: 8