Structured pruning of neural networks for constraints learning

Cited by: 0
Authors
Cacciola, Matteo [1 ]
Frangioni, Antonio [2 ]
Lodi, Andrea [3 ]
Affiliations
[1] CERC, Polytechnique Montréal, Montréal, Canada
[2] University of Pisa, Pisa, Italy
[3] Cornell Tech and Technion – Israel Institute of Technology, New York, NY, United States
Keywords
Adversarial machine learning; Integer linear programming; Neural network models
DOI
10.1016/j.orl.2024.107194
Abstract
In recent years, the integration of Machine Learning (ML) models with Operations Research (OR) tools has gained popularity in applications such as cancer treatment, algorithmic configuration, and chemical process optimization. This integration often uses Mixed Integer Programming (MIP) formulations to represent the chosen ML model, which is often an Artificial Neural Network (ANN) given their widespread use. However, ANNs frequently contain a large number of parameters, resulting in MIP formulations that are impractical to solve. In this paper we showcase the effectiveness of ANN pruning when applied to models prior to their integration into MIPs. We discuss why pruning is more suitable in this context than other ML compression techniques, and we highlight the potential of appropriate pruning strategies via experiments on MIPs used to construct adversarial examples for ANNs. Our results demonstrate that pruning offers remarkable reductions in solution times without hindering the quality of the final decision, enabling the resolution of previously unsolvable instances. © 2024 Elsevier B.V.
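To make concrete why pruning matters for these formulations, the sketch below (not the authors' code; the layer sizes, random weights, and magnitude-based criterion are illustrative assumptions) counts the binary variables in the standard big-M MIP encoding of a ReLU network, in which each hidden neuron contributes one binary variable, before and after structured pruning removes whole neurons:

```python
# Minimal sketch: structured pruning shrinks the big-M MIP encoding of a
# ReLU network because every removed hidden neuron removes one binary
# variable (and its big-M constraints). All sizes below are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

def prune_neurons(W, b, W_next, keep_ratio):
    """Magnitude-based structured pruning (an assumed, illustrative criterion):
    drop the hidden neurons whose incoming weight rows have the smallest L2
    norm, and delete the matching columns of the next layer so the pruned
    network stays dimensionally consistent."""
    scores = np.linalg.norm(W, axis=1)        # one importance score per neuron
    k = max(1, int(keep_ratio * W.shape[0]))  # number of neurons to keep
    keep = np.sort(np.argsort(scores)[-k:])   # indices of surviving neurons
    return W[keep], b[keep], W_next[:, keep]

# A toy two-hidden-layer ReLU network: 10 -> 64 -> 64 -> 1.
W1, b1 = rng.normal(size=(64, 10)), rng.normal(size=64)
W2, b2 = rng.normal(size=(64, 64)), rng.normal(size=64)
W3, b3 = rng.normal(size=(1, 64)), rng.normal(size=1)

binaries_before = W1.shape[0] + W2.shape[0]  # one binary per ReLU neuron

W1, b1, W2 = prune_neurons(W1, b1, W2, keep_ratio=0.3)
W2, b2, W3 = prune_neurons(W2, b2, W3, keep_ratio=0.3)

binaries_after = W1.shape[0] + W2.shape[0]
print(f"binary variables in the big-M MIP: {binaries_before} -> {binaries_after}")
```

Because a pruned neuron disappears from the formulation entirely, structured pruning shrinks the MIP itself; unstructured sparsification or quantization, by contrast, leaves the variable count unchanged, which is one reason pruning is the better fit in this setting.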
Related papers (50 in total)
  • [41] Learning Understandable Neural Networks With Nonnegative Weight Constraints
    Chorowski, Jan
    Zurada, Jacek M.
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2015, 26 (01) : 62 - 69
  • [42] Structured Pruning of RRAM Crossbars for Efficient In-Memory Computing Acceleration of Deep Neural Networks
    Meng, Jian
    Yang, Li
    Peng, Xiaochen
    Yu, Shimeng
    Fan, Deliang
    Seo, Jae-Sun
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS II-EXPRESS BRIEFS, 2021, 68 (05) : 1576 - 1580
  • [43] Sparsity in deep learning: Pruning and growth for efficient inference and training in neural networks
    Hoefler, Torsten
    Alistarh, Dan
    Ben-Nun, Tal
    Dryden, Nikoli
    Peste, Alexandra
    JOURNAL OF MACHINE LEARNING RESEARCH, 2021, 22
  • [44] Learning Contact Dynamics using Physically Structured Neural Networks
    Hochlehnert, Andreas
    Terenin, Alexander
    Saemundsson, Steindor
    Deisenroth, Marc Peter
    24TH INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS (AISTATS), 2021, 130
  • [45] Interpretable Task-inspired Adaptive Filter Pruning for Neural Networks Under Multiple Constraints
    Guo, Yang
    Gao, Wei
    Li, Ge
    INTERNATIONAL JOURNAL OF COMPUTER VISION, 2024, 132 (06) : 2060 - 2076
  • [46] Structured pruning via feature channels similarity and mutual learning for convolutional neural network compression
    Yang, Wei
    Xiao, Yancai
    APPLIED INTELLIGENCE, 2022, 52 : 14560 - 14570
  • [47] Structured learning via convolutional neural networks for vehicle detection
    Maqueda, Ana I.
    del Blanco, Carlos R.
    Jaureguizar, Fernando
    Garcia, Narciso
    REAL-TIME IMAGE AND VIDEO PROCESSING 2017, 2017, 10223
  • [48] Learning and Convergence Analysis of Neural-Type Structured Networks
    Polycarpou, M. M.
    Ioannou, P. A.
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 1992, 3 (01) : 39 - 50
  • [49] Spectral Pruning for Recurrent Neural Networks
    Furuya, Takashi
    Suetake, Kazuma
    Taniguchi, Koichi
    Kusumoto, Hiroyuki
    Saiin, Ryuji
    Daimon, Tomohiro
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 151, 2022, 151
  • [50] On the use of a pruning prior for neural networks
    Goutte, C.
    NEURAL NETWORKS FOR SIGNAL PROCESSING VI, 1996, : 52 - 61