Neural Networks as Surrogate Models for Measurements in Optimization Algorithms

Cited by: 0
Authors
Holena, Martin [1 ,2 ]
Linke, David [1 ]
Rodemerck, Uwe [1 ]
Bajer, Lukas [2 ]
Affiliations
[1] Leibniz Inst Catalysis, D-18059 Rostock, Germany
[2] Acad Sci Czech Republic, Inst Comp Sci, 18207 Prague 8, Czech Republic
Keywords
Functions evaluated via measurements; evolutionary optimization; surrogate modelling; neural networks; boosting
DOI
Not available
Chinese Library Classification
TP301 [Theory, Methods]
Discipline Code
081202
Abstract
The paper deals with surrogate modelling, a modern approach to the optimization of objective functions evaluated via measurements. The approach leads to a substantial decrease in the time and cost of evaluating the objective function, a property that is particularly attractive in evolutionary optimization. The paper recalls common strategies for using surrogate models in evolutionary optimization and proposes two extensions to those strategies: an extension to boosted surrogate models and an extension to using a set of models. Both are currently being implemented, in connection with surrogate modelling based on feed-forward neural networks, in a software tool for problem-tailored evolutionary optimization of catalytic materials. The paper presents results of experimental tests of the parts implemented so far, comparing boosted surrogate models with models without boosting; the results clearly confirm the usefulness of both proposed extensions.
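To illustrate the strategy summarized in the abstract, the following is a minimal sketch (not the authors' implementation or software tool) of surrogate-assisted evolutionary optimization with a boosted feed-forward neural-network surrogate: a small ensemble of networks is fitted stage-wise to residuals, offspring are pre-screened by the cheap surrogate, and only the most promising ones are "measured" on the expensive objective. The toy objective, the use of scikit-learn's MLPRegressor, and all parameter values are assumptions made for the example.

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)

    def expensive_objective(x):
        # Stand-in for an objective evaluated via costly measurements (assumed, minimized).
        return np.sum((x - 0.3) ** 2, axis=-1)

    def fit_boosted_surrogate(X, y, n_stages=3):
        # Boosting by stage-wise fitting of residuals with small feed-forward nets.
        stages, residual = [], y.astype(float).copy()
        for _ in range(n_stages):
            net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
            net.fit(X, residual)
            stages.append(net)
            residual = residual - net.predict(X)
        return stages

    def surrogate_predict(stages, X):
        # Ensemble prediction is the sum of the stage-wise predictions.
        return sum(net.predict(X) for net in stages)

    dim, pop_size, n_offspring, n_measured = 4, 20, 60, 5
    pop = rng.uniform(0.0, 1.0, size=(pop_size, dim))
    fitness = expensive_objective(pop)          # initial "measurements"
    X_seen, y_seen = pop.copy(), fitness.copy()

    for generation in range(10):
        surrogate = fit_boosted_surrogate(X_seen, y_seen)

        # Variation: Gaussian mutation of randomly chosen parents.
        parents = pop[rng.integers(0, pop_size, size=n_offspring)]
        offspring = np.clip(parents + 0.1 * rng.standard_normal(parents.shape), 0.0, 1.0)

        # Pre-screening: rank offspring by the cheap surrogate, measure only the best few.
        promising = offspring[np.argsort(surrogate_predict(surrogate, offspring))[:n_measured]]
        measured = expensive_objective(promising)

        X_seen = np.vstack([X_seen, promising])
        y_seen = np.concatenate([y_seen, measured])

        # Plus-selection: keep the best pop_size individuals measured so far.
        order = np.argsort(y_seen)[:pop_size]
        pop, fitness = X_seen[order], y_seen[order]

        print(f"generation {generation}: best measured objective = {fitness[0]:.4f}")

In this sketch the number of expensive evaluations per generation is limited to n_measured, which is the source of the time and cost savings the abstract refers to; the boosted ensemble and the pre-screening rule are only one of several possible model-management strategies.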
Pages: 351-366
Page count: 16
Related Papers
50 in total
  • [21] Functional optimization of online algorithms in multilayer neural networks
    Vicente, R
    Caticha, N
    JOURNAL OF PHYSICS A-MATHEMATICAL AND GENERAL, 1997, 30 (17): L599-L605
  • [22] Using neural networks to speed up optimization algorithms
    Bazan, M
    Russenschuck, S
    EUROPEAN PHYSICAL JOURNAL-APPLIED PHYSICS, 2000, 12 (02): 109-115
  • [23] A Dynamical View on Optimization Algorithms of Overparameterized Neural Networks
    Bu, Zhiqi
    Xu, Shiyun
    Chen, Kan
    24TH INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS (AISTATS), 2021, 130
  • [24] Differentially Private Optimization Algorithms for Deep Neural Networks
    Gylberth, Roan
    Adnan, Risman
    Yazid, Setiadi
    Basaruddin, T.
    2017 INTERNATIONAL CONFERENCE ON ADVANCED COMPUTER SCIENCE AND INFORMATION SYSTEMS (ICACSIS), 2017: 387-393
  • [25] Hybrid Artificial Neural Networks: Models, Algorithms and Data
    Gutierrez, P. A.
    Hervas-Martinez, C.
    ADVANCES IN COMPUTATIONAL INTELLIGENCE, IWANN 2011, PT II, 2011, 6692: 177-184
  • [26] Hyperparameter Optimization for Convolutional Neural Networks with Genetic Algorithms and Bayesian Optimization
    Puentes G, David E.
    Barrios H, Carlos J.
    Navaux, Philippe O. A.
    2022 IEEE LATIN AMERICAN CONFERENCE ON COMPUTATIONAL INTELLIGENCE (LA-CCI), 2022: 131-135
  • [27] A critical review on intelligent optimization algorithms and surrogate models for conventional and unconventional reservoir production optimization
    Wang, Lian
    Yao, Yuedong
    Luo, Xiaodong
    Adenutsi, Caspar Daniel
    Zhao, Guoxiang
    Lai, Fengpeng
    FUEL, 2023, 350
  • [28] Neural Networks for Surrogate-assisted Evolutionary Optimization of Chemical Processes
    Janus, Tim
    Luebbers, Anne
    Engell, Sebastian
    2020 IEEE CONGRESS ON EVOLUTIONARY COMPUTATION (CEC), 2020
  • [29] Design of Antenna Rapid Optimization Platform Based on Intelligent Algorithms and Surrogate Models
    Jin, Fan
    Dong, Jian
    Wang, Meng
    Wang, Shan
    2018 12TH INTERNATIONAL SYMPOSIUM ON ANTENNAS, PROPAGATION AND ELECTROMAGNETIC THEORY (ISAPE), 2018
  • [30] Gravitational-wave surrogate models powered by artificial neural networks
    Khan, Sebastian
    Green, Rhys
    PHYSICAL REVIEW D, 2021, 103 (06)