A Multi-objective Optimization Model for Redundancy Reduction in Convolutional Neural Networks

Cited by: 2
Authors
Boufssasse, Ali [1 ]
Hssayni, El Houssaine [2 ]
Joudar, Nour-Eddine [3 ]
Ettaouil, Mohamed [1 ]
Affiliations
[1] Sidi Mohamed Ben Abdellah Univ, Dept Math, FST Fez, Fes, Morocco
[2] Mohammed V Univ Rabat, ENSIAS, Rabat, Morocco
[3] Mohammed V Univ Rabat, ENSAM, Rabat, Morocco
Keywords
Multi-objective optimization; Pareto front; NSGA-II; Convolutional neural networks; Redundancy reduction; Image classification; Evolutionary algorithms
DOI
10.1007/s11063-023-11223-2
CLC number
TP18 [Artificial intelligence theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Convolutional neural networks (CNNs) remain among the most powerful and robust deep neural networks, having proved their efficiency in several real-world applications. However, they require a large number of parameters, which in turn leads to undesired effects such as overparametrization, overfitting, and high consumption of computational resources. To deal effectively with these issues, we propose in this paper a new multi-objective optimization model for redundancy reduction in CNNs. The suggested model, named MoRR-CNN, eliminates unwanted parameters (kernels and weights) and speeds up the CNN evaluation process. It consists of two objectives: the first is related to the training task, whose solution is the set of optimal parameters; these parameters are combined with a set of decision variables that control their contribution to the training process, yielding a second, redundancy-related objective function. Both objectives are optimized using the non-dominated sorting genetic algorithm NSGA-II. The robustness of MoRR-CNN is demonstrated through experiments on three benchmark datasets, MNIST, Fashion-MNIST, and CIFAR, using three well-known CNNs: VGG-19, Net-in-Net, and VGG-16.
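The abstract only sketches the formulation, but the general recipe it describes, binary decision variables that gate kernels, a task-loss objective plus a redundancy objective, and NSGA-II to search the trade-off, can be illustrated with a small toy example. The sketch below uses PyTorch and pymoo; everything in it (TinyCNN, KernelPruningProblem, the random data, the pre-training schedule, the population size) is an assumption made for illustration and is not the authors' MoRR-CNN implementation, which also optimizes the network parameters themselves.

```python
# Illustrative sketch only: binary decision variables gate the kernels of one
# conv layer, and NSGA-II trades off task loss against the number of kernels
# kept. TinyCNN, KernelPruningProblem, the random data and all hyper-parameters
# are assumptions for this demo, not the paper's MoRR-CNN implementation.
import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F

from pymoo.core.problem import ElementwiseProblem
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.operators.sampling.rnd import BinaryRandomSampling
from pymoo.operators.crossover.pntx import TwoPointCrossover
from pymoo.operators.mutation.bitflip import BitflipMutation
from pymoo.optimize import minimize

N_KERNELS = 16

class TinyCNN(nn.Module):
    """A small CNN whose conv kernels can be switched off by a binary mask."""
    def __init__(self, n_kernels=N_KERNELS):
        super().__init__()
        self.conv = nn.Conv2d(1, n_kernels, 3, padding=1)
        self.fc = nn.Linear(n_kernels * 14 * 14, 10)

    def forward(self, x, mask=None):
        h = F.relu(self.conv(x))
        if mask is not None:                 # zero out the gated-off kernels
            h = h * mask.view(1, -1, 1, 1)
        return self.fc(F.max_pool2d(h, 2).flatten(1))

# Random tensors stand in for an image dataset such as MNIST.
torch.manual_seed(0)
x_train, y_train = torch.randn(256, 1, 28, 28), torch.randint(0, 10, (256,))
x_val, y_val = torch.randn(128, 1, 28, 28), torch.randint(0, 10, (128,))

# Short pre-training pass so the masks have non-trivial kernels to select from.
model = TinyCNN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(30):
    opt.zero_grad()
    F.cross_entropy(model(x_train), y_train).backward()
    opt.step()

class KernelPruningProblem(ElementwiseProblem):
    """Objective 1: loss of the gated network; objective 2: ratio of kernels kept."""
    def __init__(self, model, x, y, n_kernels):
        super().__init__(n_var=n_kernels, n_obj=2)
        self.model, self.x, self.y = model, x, y

    def _evaluate(self, mask, out, *args, **kwargs):
        m = torch.tensor(mask, dtype=torch.float32)
        with torch.no_grad():
            loss = F.cross_entropy(self.model(self.x, m), self.y).item()
        out["F"] = [loss, float(mask.sum()) / len(mask)]

problem = KernelPruningProblem(model, x_val, y_val, N_KERNELS)
algorithm = NSGA2(pop_size=20,
                  sampling=BinaryRandomSampling(),
                  crossover=TwoPointCrossover(),
                  mutation=BitflipMutation(),
                  eliminate_duplicates=True)
res = minimize(problem, algorithm, ("n_gen", 15), seed=1, verbose=False)

print("Pareto front (loss, kept-kernel ratio):")
print(np.round(res.F, 3))
```

Each point on the returned front pairs a loss value with the fraction of kernels kept, so a final network can be selected from the Pareto front according to how much redundancy one is willing to remove.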
Pages: 9721-9741
Page count: 21