A Min-Max Optimization Framework for Multi-task Deep Neural Network Compression

Cited by: 0
Authors
Guo, Jiacheng [1 ]
Sun, Huiming [1 ]
Qin, Minghai [1 ]
Yu, Hongkai [1 ]
Zhang, Tianyun [1 ]
Affiliations
[1] Cleveland State Univ, Cleveland, OH 44115 USA
Funding
U.S. National Science Foundation
Keywords
multi-task learning; deep learning; weight pruning; model compression;
DOI
10.1109/ISCAS58744.2024.10557958
CLC Number
TP39 [Computer Applications]
Discipline Codes
081203; 0835
Abstract
Multi-task learning is a subfield of machine learning in which a single shared model is trained to solve multiple tasks simultaneously. Instead of training a separate model for each task, multi-task learning requires only one model with shared parameters. This substantially reduces the number of parameters in the model and thus lowers its computational and storage requirements. When multi-task learning is applied to deep neural networks (DNNs), the model must be compressed further, since the size of even a single DNN remains a critical challenge for many computing systems, especially edge platforms. However, when model compression is applied to multi-task learning, it is challenging to maintain the performance of all the different tasks. To address this challenge, we propose a min-max optimization framework for training highly compressed multi-task DNN models. Our proposed framework automatically adjusts learnable weighting factors for the different tasks, guaranteeing that the task with the worst-case performance across all tasks is the one being optimized.
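The min-max idea described in the abstract can be sketched as follows. This is not the paper's exact algorithm: the function names and the projected-gradient-ascent update on the task weights are illustrative assumptions. The inner minimization trains the shared model on the weighted loss, while the outer maximization shifts probability mass toward the task with the currently worst (largest) loss, so that task drives the next update.

```python
import numpy as np

def project_to_simplex(v):
    """Euclidean projection of v onto the probability simplex
    (sorting-based method; ensures weights are >= 0 and sum to 1)."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    # Largest index rho such that u[rho] * (rho+1) > css[rho] - 1.
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > (css - 1.0))[0][-1]
    theta = (css[rho] - 1.0) / (rho + 1)
    return np.maximum(v - theta, 0.0)

def update_task_weights(weights, task_losses, lr=0.1):
    """One outer (maximization) step of the min-max game.

    The gradient of the weighted loss sum_j w_j * L_j with respect to
    w_i is just L_i, so gradient ascent raises the weight of tasks with
    larger loss; projection keeps the weights on the simplex.
    """
    return project_to_simplex(np.asarray(weights) + lr * np.asarray(task_losses))
```

For example, starting from uniform weights over three tasks with losses [1.0, 2.0, 0.5], one update assigns the largest weight to task 1 (the worst performer) and the smallest to task 2, while the weights still sum to one. In a full training loop, each such outer step would alternate with ordinary gradient descent on the shared (compressed) model parameters under the current weighted loss.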
Pages: 5
Related Papers (50 total)
  • [1] Multi-block Min-max Bilevel Optimization with Applications in Multi-task Deep AUC Maximization
    Hu, Quanqi
    Zhong, Yongjian
    Yang, Tianbao
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [2] Deep Fuzzy Min-Max Neural Network: Analysis and Design
    Huang, Wei
    Sun, Mingxi
    Zhu, Liehuang
    Oh, Sung-Kwun
    Pedrycz, Witold
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (06) : 8229 - 8240
  • [3] Improving Deep Neural Network Performance with Kernelized Min-Max Objective
    Yao, Kai
    Huang, Kaizhu
    Zhang, Rui
    Hussain, Amir
    [J]. NEURAL INFORMATION PROCESSING (ICONIP 2018), PT I, 2018, 11301 : 182 - 191
  • [4] Multi-Level Fuzzy Min-Max Neural Network Classifier
    Davtalab, Reza
    Dezfoulian, Mir Hossein
    Mansoorizadeh, Muharram
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2014, 25 (03) : 470 - 482
  • [5] Redefined Fuzzy Min-Max Neural Network
    Wang, Yage
    Huang, Wei
    Wang, Jinsong
    [J]. 2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021,
  • [6] A Simple Algorithm for Min-Max Network Optimization
    Dimaio, B.
    Sorbello, F.
    [J]. ALTA FREQUENZA, 1988, 57 (05): : 259 - 265
  • [7] Improving deep neural network performance by integrating kernelized Min-Max objective
    Wang, Qiu-Feng
    Yao, Kai
    Zhang, Rui
    Hussain, Amir
    Huang, Kaizhu
    [J]. NEUROCOMPUTING, 2020, 408 : 82 - 90
  • [8] Multimodal multi-task deep neural network framework for kinase–target prediction
    Yi Hua
    Lin Luo
    Haodi Qiu
    Dingfang Huang
    Yang Zhao
    Haichun Liu
    Tao Lu
    Yadong Chen
    Yanmin Zhang
    Yulei Jiang
    [J]. Molecular Diversity, 2023, 27 : 2491 - 2503
  • [9] ConnectomeNet: A Unified Deep Neural Network Modeling Framework for Multi-Task Learning
    Lim, Heechul
    Chon, Kang-Wook
    Kim, Min-Soo
    [J]. IEEE ACCESS, 2023, 11 : 34297 - 34308
  • [10] Fuzzy min-max neural network for image segmentation
    Estévez, PA
    Ruz, GA
    Perez, CA
    [J]. PROCEEDINGS OF THE 7TH JOINT CONFERENCE ON INFORMATION SCIENCES, 2003, : 655 - 659