Autotune: A Derivative-free Optimization Framework for Hyperparameter Tuning

Cited by: 60
Authors
Koch, Patrick [1 ]
Golovidov, Oleg [1 ]
Gardner, Steven [1 ]
Wujek, Brett [1 ]
Griffin, Joshua [1 ]
Xu, Yan [1 ]
Affiliations
[1] SAS Inst Inc, Cary, NC 27513 USA
Keywords
Derivative-free Optimization; Stochastic Optimization; Bayesian Optimization; Hyperparameters; Distributed Computing System
DOI
10.1145/3219819.3219837
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Machine learning applications often require hyperparameter tuning. The hyperparameters usually drive both the efficiency of the model training process and the quality of the resulting model. For the purpose of hyperparameter tuning, machine learning algorithms are complex black boxes. This creates a class of challenging optimization problems whose objective functions tend to be nonsmooth, discontinuous, and unpredictably varying in computational expense, and whose variables may be continuous, categorical, and/or integer. Further, function evaluations can fail for a variety of reasons, including numerical difficulties or hardware failures. Additionally, not all hyperparameter value combinations are compatible, which creates so-called hidden constraints. Robust and efficient optimization algorithms are needed for hyperparameter tuning. In this paper we present an automated parallel derivative-free optimization framework called Autotune, which combines a number of specialized sampling and search methods that are very effective in tuning machine learning models despite these challenges. On real-world applications, Autotune produces significantly improved models over default hyperparameter settings with minimal user interaction. Given the inherent expense of training numerous candidate models, we demonstrate the effectiveness of Autotune's search methods and the efficient distributed and parallel paradigms for training and tuning models, and we also discuss the resource trade-offs associated with the ability to both distribute the training process and parallelize the tuning process.
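The abstract frames hyperparameter tuning as derivative-free optimization of a black-box objective over mixed continuous, integer, and categorical variables, with failed evaluations and hidden constraints, parallelized across candidate configurations. As a rough illustration of that framing only (this is not Autotune's API; the search space, objective, and every name below are hypothetical), a minimal Python sketch might look like:

```python
# Minimal sketch (NOT Autotune's API): hyperparameter tuning as parallel
# derivative-free black-box optimization. All names here are hypothetical.
import math
import random
from concurrent.futures import ProcessPoolExecutor

# Hypothetical mixed search space: continuous, integer, and categorical
# hyperparameters, matching the variable types described in the abstract.
SPACE = {
    "learning_rate": ("continuous", 1e-4, 1e-1),
    "num_trees":     ("integer", 10, 500),
    "loss":          ("categorical", ["squared", "absolute"]),
}

def sample(rng):
    """Draw one random configuration from SPACE."""
    cfg = {}
    for name, spec in SPACE.items():
        if spec[0] == "continuous":   # sample on a log scale
            cfg[name] = math.exp(rng.uniform(math.log(spec[1]), math.log(spec[2])))
        elif spec[0] == "integer":
            cfg[name] = rng.randint(spec[1], spec[2])
        else:
            cfg[name] = rng.choice(spec[1])
    return cfg

def objective(cfg):
    """Stand-in for 'train a model, return validation error'.
    Returns None for incompatible combinations (a hidden constraint)."""
    if cfg["loss"] == "absolute" and cfg["num_trees"] > 400:
        return None  # pretend this combination fails to train
    # Synthetic, nonsmooth surrogate for a real validation metric.
    return abs(math.log10(cfg["learning_rate"]) + 2) + abs(cfg["num_trees"] - 200) / 100

def tune(budget=32, workers=4, seed=0):
    """Evaluate `budget` random configurations, `workers` at a time."""
    rng = random.Random(seed)
    candidates = [sample(rng) for _ in range(budget)]
    best_score, best_cfg = float("inf"), None
    with ProcessPoolExecutor(max_workers=workers) as pool:
        for cfg, score in zip(candidates, pool.map(objective, candidates)):
            if score is not None and score < best_score:  # skip failed runs
                best_score, best_cfg = score, cfg
    return best_score, best_cfg

if __name__ == "__main__":
    score, cfg = tune()
    print(f"best score {score:.3f} with {cfg}")
```

Autotune replaces the pure random sampling above with its specialized sampling and search methods; the sketch is only meant to show the black-box evaluation structure, the discarding of infeasible configurations, and the parallel-evaluation trade-off mentioned in the abstract.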
Pages: 443-452
Number of pages: 10