Hyper-parameter Optimization Using Continuation Algorithms

Cited by: 1
|
Authors
Rojas-Delgado, Jairo [1 ]
Jimenez, J. A. [2 ]
Bello, Rafael [3 ]
Lozano, J. A. [1 ,4 ]
Affiliations
[1] Basque Ctr Appl Math, Bilbao, Spain
[2] Univ Ciencias Informat, Havana, Cuba
[3] Univ Cent Las Villas, Santa Clara, Cuba
[4] Univ Basque Country UPV EHU, Donosti, Intelligent Syst Grp, Donostia San Sebastian, Spain
Source
METAHEURISTICS, MIC 2022 | 2023 / Vol. 13838
Keywords
Hyper-parameter; Optimization; Continuation; Machine learning;
DOI
10.1007/978-3-031-26504-4_26
CLC number
TP18 [Theory of Artificial Intelligence];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Hyper-parameter optimization is a common task in many application areas and a challenging optimization problem. In this paper, we introduce an approach to searching for hyper-parameters based on continuation algorithms that can be coupled with existing hyper-parameter optimization methods. Our continuation approach can be seen as a heuristic for obtaining lower-fidelity surrogates of the fitness function. In our experiments, we conduct hyper-parameter optimization of neural networks trained on a benchmark set of forecasting regression problems, where generalization to unseen data is required. Our results show a small but statistically significant improvement in accuracy with respect to the state of the art, without negatively affecting the execution time.
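The paper's own algorithm is not reproduced in this record. As a hedged illustration of the general idea the abstract describes (minimizing a sequence of increasingly faithful surrogates of the fitness function, warm-starting each level at the previous solution), here is a minimal toy sketch in Python. The objective, the Gaussian-smoothing schedule, and the gradient-descent settings are hypothetical stand-ins chosen for this sketch, not the authors' method.

```python
import math

def objective(x, sigma=0.0):
    """Toy rugged fitness surface: a quadratic bowl (minimum near x = 0.1)
    plus an oscillation that creates many local minima. `sigma` controls
    Gaussian smoothing: larger sigma yields a lower-fidelity surrogate in
    which the oscillation is damped (closed form for this toy case)."""
    damp = math.exp(-0.5 * (40.0 * sigma) ** 2)  # smoothing factor for cos(40x)
    return (x - 0.1) ** 2 + 0.05 * damp * math.cos(40.0 * x)

def local_descent(x, sigma, step=0.02, iters=300):
    """Plain gradient descent on the smoothed surrogate at fidelity `sigma`."""
    damp = math.exp(-0.5 * (40.0 * sigma) ** 2)
    for _ in range(iters):
        grad = 2.0 * (x - 0.1) - 2.0 * damp * math.sin(40.0 * x)
        x -= step * grad
    return x

def continuation_search(x0=0.8, sigmas=(0.1, 0.05, 0.02, 0.0)):
    """Continuation: minimize a sequence of surrogates of increasing
    fidelity, warm-starting each level at the previous minimizer."""
    x = x0
    for sigma in sigmas:
        x = local_descent(x, sigma)
    return x
```

On this toy surface, the heavily smoothed first level is nearly convex, so the descent reaches the global basin; each subsequent, higher-fidelity level then only refines the solution locally, which is the continuation effect the abstract exploits for hyper-parameter search.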
Pages: 365 - 377
Number of pages: 13
Related papers
50 records in total
  • [1] Hyper-Parameter Optimization Using MARS Surrogate for Machine-Learning Algorithms
    Li, Yangyang
    Liu, Guangyuan
    Lu, Gao
    Jiao, Licheng
    Marturi, Naresh
    Shang, Ronghua
    IEEE TRANSACTIONS ON EMERGING TOPICS IN COMPUTATIONAL INTELLIGENCE, 2020, 4 (03): : 287 - 297
  • [2] Hyper-Parameter Optimization in Support Vector Machine on Unbalanced Datasets Using Genetic Algorithms
    Guido, Rosita
    Groccia, Maria Carmela
    Conforti, Domenico
    OPTIMIZATION IN ARTIFICIAL INTELLIGENCE AND DATA SCIENCES, 2022, : 37 - 47
  • [3] Random Search for Hyper-Parameter Optimization
    Bergstra, James
    Bengio, Yoshua
    JOURNAL OF MACHINE LEARNING RESEARCH, 2012, 13 : 281 - 305
  • [5] Hyper-parameter Optimization for Latent Spaces
    Veloso, Bruno
    Caroprese, Luciano
    Konig, Matthias
    Teixeira, Sonia
    Manco, Giuseppe
    Hoos, Holger H.
    Gama, Joao
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2021: RESEARCH TRACK, PT III, 2021, 12977 : 249 - 264
  • [6] Federated learning with hyper-parameter optimization
    Kundroo, Majid
    Kim, Taehong
    JOURNAL OF KING SAUD UNIVERSITY-COMPUTER AND INFORMATION SCIENCES, 2023, 35 (09)
  • [7] HYPER-PARAMETER OPTIMIZATION FOR CONVOLUTIONAL NEURAL NETWORK COMMITTEES BASED ON EVOLUTIONARY ALGORITHMS
    Bochinski, Erik
    Senst, Tobias
    Sikora, Thomas
    2017 24TH IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2017, : 3924 - 3928
  • [8] USING METAHEURISTICS FOR HYPER-PARAMETER OPTIMIZATION OF CONVOLUTIONAL NEURAL NETWORKS
    Bibaeva, Victoria
    2018 IEEE 28TH INTERNATIONAL WORKSHOP ON MACHINE LEARNING FOR SIGNAL PROCESSING (MLSP), 2018,
  • [9] Hyper-parameter Tuning using Genetic Algorithms for Software Effort Estimation
    Villalobos-Arias, Leonardo
    Quesada-Lopez, Christian
    Jenkins, Marcelo
    Murillo-Morera, Juan
    PROCEEDINGS OF 2021 16TH IBERIAN CONFERENCE ON INFORMATION SYSTEMS AND TECHNOLOGIES (CISTI'2021), 2021,
  • [10] Hyper-Parameter Optimization for Emotion Detection using Physiological Signals
    Albraikan, Amani
    Tobon, Diana P.
    El Saddik, Abdulmotaleb
    2018 IEEE INTERNATIONAL CONFERENCE ON PERVASIVE COMPUTING AND COMMUNICATIONS WORKSHOPS (PERCOM WORKSHOPS), 2018,