Initializing Bayesian Hyperparameter Optimization via Meta-Learning

Cited by: 0
Authors
Feurer, Matthias [1]
Springenberg, Jost Tobias [1]
Hutter, Frank [1]
Affiliations
[1] Univ Freiburg, Comp Sci Dept, Georges Kohler Allee 52, D-79110 Freiburg, Germany
Keywords
SEARCH
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Model selection and hyperparameter optimization are crucial in applying machine learning to a novel dataset. Recently, a sub-community of machine learning has focused on solving this problem with Sequential Model-based Bayesian Optimization (SMBO), demonstrating substantial successes in many applications. However, for computationally expensive algorithms the overhead of hyperparameter optimization can still be prohibitive. In this paper we mimic a strategy human domain experts use: speed up optimization by starting from promising configurations that performed well on similar datasets. The resulting initialization technique integrates naturally into the generic SMBO framework and can be trivially applied to any SMBO method. To validate our approach, we perform extensive experiments with two established SMBO frameworks (Spearmint and SMAC) with complementary strengths, optimizing two machine learning frameworks on 57 datasets. Our initialization procedure yields mild improvements for low-dimensional hyperparameter optimization and substantially improves the state of the art for the more complex combined algorithm selection and hyperparameter optimization problem.
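The initialization strategy described in the abstract (seed SMBO with configurations that performed well on the most similar previously seen datasets) can be sketched in a few lines. The snippet below is a minimal illustration and not the authors' released code; the `prior_runs` store, the function names, and the plain L1 meta-feature distance are assumptions made for the example.

```python
import numpy as np

def metafeature_distance(mf_new, mf_old):
    # Assumption: L1 distance on (already normalized) meta-feature vectors.
    return float(np.sum(np.abs(np.asarray(mf_new) - np.asarray(mf_old))))

def warmstart_configurations(new_metafeatures, prior_runs, k=5):
    """prior_runs: list of (metafeatures, best_configuration) pairs, one per
    previously optimized dataset (hypothetical structure for this sketch)."""
    ranked = sorted(prior_runs,
                    key=lambda run: metafeature_distance(new_metafeatures, run[0]))
    # The best-known configurations of the k most similar datasets become the
    # initial design that is evaluated before the regular SMBO loop starts.
    return [best_config for _, best_config in ranked[:k]]

def smbo_with_warmstart(objective, new_metafeatures, prior_runs):
    history = []
    for config in warmstart_configurations(new_metafeatures, prior_runs):
        history.append((config, objective(config)))
    # From here a standard SMBO loop would fit a surrogate model on `history`,
    # maximize an acquisition function, evaluate the proposed configuration,
    # and repeat until the evaluation budget is exhausted.
    return history
```

Because the warm-start only changes which configurations are evaluated first, it leaves the surrogate model and acquisition function untouched, which is why the abstract notes it can be applied to any SMBO method.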
Pages: 1128 - 1135
Number of pages: 8
Related papers
50 items in total
  • [31] Online Bayesian Meta-Learning for Cognitive Tracking Radar
    Thornton, Charles E.
    Buehrer, Richard M.
    Martone, Anthony F.
    IEEE TRANSACTIONS ON AEROSPACE AND ELECTRONIC SYSTEMS, 2023, 59 (05) : 6485 - 6500
  • [32] Continual Quality Estimation with Online Bayesian Meta-Learning
    Obamuyide, Abiola
    Fomicheva, Marina
    Specia, Lucia
    ACL-IJCNLP 2021: THE 59TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 11TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING, VOL 2, 2021, : 190 - 197
  • [33] Efficient Memory Circuits Yield Analysis and Optimization Framework via Meta-Learning
    Wang, Ziqi
    Pang, Liang
    Shi, Xiao
    Shi, Longxing
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS II-EXPRESS BRIEFS, 2024, 71 (09) : 4196 - 4200
  • [34] Bayesian Hyperparameter Optimization for Machine Learning Based eQTL Analysis
    Quitadamo, Andrew
    Johnson, James
    Shi, Xinghua
    ACM-BCB 2017: PROCEEDINGS OF THE 8TH ACM INTERNATIONAL CONFERENCE ON BIOINFORMATICS, COMPUTATIONAL BIOLOGY, AND HEALTH INFORMATICS, 2017, : 98 - 106
  • [35] Online Optimization Method of Learning Process for Meta-Learning
    Xu, Zhixiong
    Zhang, Wei
    Li, Ailin
    Zhao, Feifei
    Jing, Yuanyuan
    Wan, Zheng
    Cao, Lei
    Chen, Xiliang
    COMPUTER JOURNAL, 2023, 67 (05) : 1645 - 1651
  • [36] Few-shot classification via efficient meta-learning with hybrid optimization
    Jia, Jinfang
    Feng, Xiang
    Yu, Huiqun
    ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2024, 127
  • [37] Personalized Image Aesthetics Assessment via Meta-Learning With Bilevel Gradient Optimization
    Zhu, Hancheng
    Li, Leida
    Wu, Jinjian
    Zhao, Sicheng
    Ding, Guiguang
    Shi, Guangming
    IEEE TRANSACTIONS ON CYBERNETICS, 2022, 52 (03) : 1798 - 1811
  • [38] Meta-learning for evolutionary parameter optimization of classifiers
    Reif, Matthias
    Shafait, Faisal
    Dengel, Andreas
    MACHINE LEARNING, 2012, 87 (03) : 357 - 380
  • [39] Meta-learning approach to neural network optimization
    Kordik, Pavel
    Koutnik, Jan
    Drchal, Jan
    Kovarik, Oleg
    Cepek, Miroslav
    Snorek, Miroslav
    NEURAL NETWORKS, 2010, 23 (04) : 568 - 582
  • [40] Learning to Balance Local Losses via Meta-Learning
    Yoa, Seungdong
    Jeon, Minkyu
    Oh, Youngjin
    Kim, Hyunwoo J.
    IEEE ACCESS, 2021, 9 : 130834 - 130844