FlexiBO: A Decoupled Cost-Aware Multi-Objective Optimization Approach for Deep Neural Networks

Cited: 0
Authors
Iqbal, Shahriar [1 ]
Su, Jianhai [1 ]
Kotthoff, Lars [2 ]
Jamshidi, Pooyan [1 ]
Affiliations
[1] Univ South Carolina, Columbia, SC 29208 USA
[2] Univ Wyoming, Laramie, WY USA
Funding
U.S. National Science Foundation;
Keywords
ALGORITHM;
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
The design of machine learning systems often requires trading off different objectives, for example, prediction error and energy consumption for deep neural networks (DNNs). Typically, no single design performs well in all objectives; therefore, finding Pareto-optimal designs is of interest. The search for Pareto-optimal designs evaluates candidate designs in an iterative process, and the measurements are used to evaluate an acquisition function that guides the search. However, measuring different objectives incurs different costs. For example, measuring the prediction error of a DNN requires re-training it, which is orders of magnitude more expensive than measuring the energy consumption of a pre-trained DNN. Current state-of-the-art methods do not consider this difference in objective evaluation cost, potentially incurring expensive evaluations of objective functions during optimization. In this paper, we develop a novel decoupled, cost-aware multi-objective optimization algorithm, which we call Flexible Multi-Objective Bayesian Optimization (FlexiBO), to address this issue. For each design evaluation, FlexiBO selects the objective with the higher relative gain by weighting the improvement in the hypervolume of the Pareto region against the measurement cost of each objective. This strategy balances the expense of collecting new information against the knowledge gained through objective evaluations, preventing FlexiBO from performing expensive measurements for little to no gain. We evaluate FlexiBO on seven state-of-the-art DNNs for image recognition, natural language processing (NLP), and speech-to-text translation. Our results show that, given the same total experimental budget, FlexiBO discovers designs with 4.8% to 12.4% lower hypervolume error than the best state-of-the-art multi-objective optimization method.
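The cost-weighted selection rule described in the abstract can be made concrete with a small sketch. The Python fragment below is a hypothetical illustration, not the paper's implementation: for a single candidate design, it estimates the hypervolume gained by measuring each objective alone (collapsing that objective's surrogate uncertainty while the other objective stays at a pessimistic value), divides each gain by that objective's measurement cost, and selects the objective with the better ratio. The hypervolume helper, the mean-plus-one-standard-deviation pessimistic bound, and all numbers are assumptions made for illustration.

# Minimal sketch of FlexiBO-style decoupled, cost-aware objective selection.
# All names and numbers are illustrative assumptions, not the paper's code.

def hypervolume(points, ref):
    # 2-D hypervolume dominated by `points` w.r.t. reference point `ref`
    # (both objectives minimized).
    pts = sorted(p for p in points if p[0] < ref[0] and p[1] < ref[1])
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in pts:
        if f2 < prev_f2:  # non-dominated point: adds a horizontal strip
            hv += (ref[0] - f1) * (prev_f2 - f2)
            prev_f2 = f2
    return hv

def select_objective(pareto, mu, sigma, costs, ref):
    # For one candidate design with per-objective surrogate means `mu` and
    # standard deviations `sigma`, estimate the hypervolume gained by
    # measuring each objective alone, weight it by that objective's
    # measurement cost, and return the index with the best gain per cost.
    gains = []
    for j in range(len(mu)):
        pess = [m + s for m, s in zip(mu, sigma)]  # pessimistic estimate
        hv_before = hypervolume(pareto + [tuple(pess)], ref)
        pess[j] = mu[j]  # measuring objective j collapses its uncertainty
        hv_after = hypervolume(pareto + [tuple(pess)], ref)
        gains.append((hv_after - hv_before) / costs[j])
    return max(range(len(gains)), key=lambda j: gains[j])

# Toy run: objective 0 = prediction error (needs re-training, cost 100),
# objective 1 = energy of a pre-trained DNN (cost 1).
pareto = [(0.30, 5.0), (0.25, 7.0), (0.40, 3.0)]
choice = select_objective(pareto, mu=[0.22, 6.0], sigma=[0.05, 1.5],
                          costs=[100.0, 1.0], ref=(1.0, 10.0))
print("measure objective", choice)  # 1: energy wins on gain per unit cost

In this toy run the cheap energy measurement yields a smaller raw hypervolume gain than re-training for prediction error, but a far higher gain per unit cost, which is the trade-off the cost-aware acquisition strategy is designed to exploit.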
Pages: 645 - 682
Number of pages: 38
Related Papers
50 records in total
  • [21] MOSP: Multi-Objective Sensitivity Pruning of Deep Neural Networks
    Sabih, Muhammad
    Mishra, Ashutosh
    Hannig, Frank
    Teich, Jürgen
    2022 IEEE 13TH INTERNATIONAL GREEN AND SUSTAINABLE COMPUTING CONFERENCE (IGSC), 2022, : 59 - 66
  • [22] Multi-Objective Optimization of Orchestra Scheduler for Traffic-Aware Networks
    Panda, Niharika
    Muthuraman, Supriya
    Elsts, Atis
    SMART CITIES, 2024, 7 (05): 2542 - 2571
  • [23] A novel learning approach in deep spiking neural networks with multi-objective optimization algorithms for automatic digit speech recognition
    Hamian, Melika
    Faez, Karim
    Nazari, Soheila
    Sabeti, Malihe
    JOURNAL OF SUPERCOMPUTING, 2023, 79 (18): 20263 - 20288
  • [25] Probabilistic Sequential Multi-Objective Optimization of Convolutional Neural Networks
    Yin, Zixuan
    Gross, Warren
    Meyer, Brett H.
    PROCEEDINGS OF THE 2020 DESIGN, AUTOMATION & TEST IN EUROPE CONFERENCE & EXHIBITION (DATE 2020), 2020, : 1055 - 1060
  • [26] Multi-Objective Optimization for Size and Resilience of Spiking Neural Networks
    Dimovska, Mihaela
    Johnston, Travis
    Schuman, Catherine D.
    Mitchell, J. Parker
    Potok, Thomas E.
    2019 IEEE 10TH ANNUAL UBIQUITOUS COMPUTING, ELECTRONICS & MOBILE COMMUNICATION CONFERENCE (UEMCON), 2019, : 433 - 439
  • [27] A Multi-objective Particle Swarm Optimization for Neural Networks Pruning
    Wu, Tao
    Shi, Jiao
    Zhou, Deyun
    Lei, Yu
    Gong, Maoguo
    2019 IEEE CONGRESS ON EVOLUTIONARY COMPUTATION (CEC), 2019, : 570 - 577
  • [28] Neural Networks Designing Neural Networks: Multi-Objective Hyper-Parameter Optimization
    Smithson, Sean C.
    Yang, Guang
    Gross, Warren J.
    Meyer, Brett H.
    2016 IEEE/ACM INTERNATIONAL CONFERENCE ON COMPUTER-AIDED DESIGN (ICCAD), 2016,
  • [29] A 'Phylogeny-aware' Multi-objective Optimization Approach for Computing MSA
    Nayeem, Muhammad Ali
    Bayzid, Md. Shamsuzzoha
    Rahman, Atif Hasan
    Shahriyar, Rifat
    Rahman, M. Sohel
    PROCEEDINGS OF THE 2019 GENETIC AND EVOLUTIONARY COMPUTATION CONFERENCE (GECCO'19), 2019, : 577 - 585
  • [30] Cost-Aware Influence Maximization in Multi-Attribute Networks
    Litou, Iouliana
    Kalogeraki, Vana
    2020 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA), 2020, : 533 - 542