Risk-Averse PID Tuning Based on Scenario Programming and Parallel Bayesian Optimization

Cited by: 1
Authors
He, Qihang [1 ]
Liu, Qingyuan [1 ]
Liang, Yangyang [2 ]
Lyu, Wenxiang [1 ]
Huang, Dexian [1 ]
Shang, Chao [1 ]
Affiliations
[1] Tsinghua Univ, Beijing Natl Res Ctr Informat Sci & Technol, Dept Automat, Beijing 100084, Peoples R China
[2] Tsingyun Intelligence Co Ltd, Beijing 100085, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
LOOP;
DOI
10.1021/acs.iecr.4c03050
Chinese Library Classification (CLC)
TQ [Chemical Industry];
Subject classification code
0817;
Abstract
The pervasiveness of PID control in process industries underscores the critical need for efficient autotuning techniques. Recently, Bayesian optimization (BO) has been widely adopted to seek optimal PID parameters and automate the tuning procedure. To evaluate the overall risk-averse performance of PID controllers, scenario programming that considers a wide range of uncertain scenarios provides a systematic approach, but it entails extensive simulations and expensive computation. Parallel computing offers a viable way to address this issue, and thus we propose a novel parallel BO algorithm for risk-averse tuning that achieves higher efficiency in both surrogate modeling and surrogate optimization. For the latter, a multi-acquisition-function strategy with diversity promotion is developed to generate widely scattered query points so that experiments can be parallelized efficiently. For the former, a data-efficient stability-aware Gaussian process modeling strategy is designed, obviating the need to build an additional classifier as required by existing methods. Numerical examples and application to a real-world industrial bio-oil processing unit demonstrate that the proposed parallel BO algorithm considerably improves the efficiency of simulation-aided PID tuning and yields practically viable controller parameters under the risk-averse tuning framework.
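To make the workflow concrete, below is a minimal illustrative sketch (not the authors' implementation) of risk-averse PI tuning with a batch-style BO loop: a plain Gaussian-process surrogate is fit to a worst-case cost over a few hypothetical plant-gain scenarios, and two acquisition functions (expected improvement and a lower confidence bound) each propose one query point per iteration, with a simple distance threshold promoting diversity in the batch. The plant model, scenario set, and all hyperparameters are placeholder assumptions.

```python
# Minimal illustrative sketch of batch BO for risk-averse PI tuning.
# Plant, scenarios, kernel hyperparameters, and bounds are hypothetical.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def closed_loop_cost(kp, ki, plant_gain):
    """Integral absolute error of a discretized first-order plant under PI control."""
    dt, y, integ, cost = 0.05, 0.0, 0.0, 0.0
    for _ in range(400):                          # 20 s horizon, unit-step setpoint
        e = 1.0 - y
        u = kp * e + ki * integ
        integ += e * dt
        y += dt * (-y + plant_gain * u)           # first-order dynamics, tau = 1
        cost += abs(e) * dt
    return cost

def risk_averse_cost(theta, scenarios=(0.8, 1.0, 1.3)):
    """Worst-case cost over uncertain plant-gain scenarios (a simple risk measure)."""
    return max(closed_loop_cost(theta[0], theta[1], g) for g in scenarios)

def gp_posterior(X, y, Xq, ls=0.5, sf=1.0, noise=1e-6):
    """Zero-mean GP regression with an RBF kernel; returns posterior mean and std."""
    def k(A, B):
        d = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
        return sf * np.exp(-0.5 * d / ls**2)
    K = k(X, X) + noise * np.eye(len(X))
    Ks = k(Xq, X)
    mu = Ks @ np.linalg.solve(K, y)
    var = sf - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)
    return mu, np.sqrt(np.maximum(var, 1e-12))

# Initial design over (Kp, Ki) in [0.1, 5] x [0.01, 2]
bounds = np.array([[0.1, 5.0], [0.01, 2.0]])
X = rng.uniform(bounds[:, 0], bounds[:, 1], size=(6, 2))
y = np.array([risk_averse_cost(t) for t in X])

for _ in range(10):
    cand = rng.uniform(bounds[:, 0], bounds[:, 1], size=(500, 2))
    ys = (y - y.mean()) / y.std()                 # standardize targets
    mu, sd = gp_posterior(X, ys, cand)
    best = ys.min()
    # Two acquisition functions propose one candidate each (minimization form)
    imp = best - mu
    ei = imp * norm.cdf(imp / sd) + sd * norm.pdf(imp / sd)
    lcb_score = -(mu - 2.0 * sd)                  # larger is better
    batch, picked = [], []
    for score in (ei, lcb_score):
        for idx in np.argsort(-score):            # diversity: skip near-duplicates
            if all(np.linalg.norm(cand[idx] - cand[j]) > 0.3 for j in picked):
                picked.append(idx)
                batch.append(cand[idx])
                break
    # In the paper's setting, these scenario simulations would run in parallel
    y_new = [risk_averse_cost(t) for t in batch]
    X = np.vstack([X, np.array(batch)])
    y = np.concatenate([y, y_new])

best_theta = X[np.argmin(y)]
print("best (Kp, Ki):", best_theta, "  worst-case IAE:", y.min())
```

In the actual method the batch of scenario simulations is dispatched in parallel; the sequential evaluation loop here only keeps the sketch self-contained.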
Pages: 564-574
Number of pages: 11
Related papers
50 records in total
  • [41] Stackelberg Game of Buyback Policy in Supply Chain with a Risk-Averse Retailer and a Risk-Averse Supplier Based on CVaR
    Zhou, Yanju
    Chen, Qian
    Chen, Xiaohong
    Wang, Zongrun
    PLOS ONE, 2014, 9 (09):
  • [42] Risk-averse policy optimization via risk-neutral policy optimization
    Bisi, Lorenzo
    Santambrogio, Davide
    Sandrelli, Federico
    Tirinzoni, Andrea
    Ziebart, Brian D.
    Restelli, Marcello
    ARTIFICIAL INTELLIGENCE, 2022, 311
  • [43] Toward Optimal Risk-Averse Configuration for HESS With CGANs-Based PV Scenario Generation
    Yang, Xiaodong
    He, Haibo
    Li, Jie
    Zhang, Youbing
    IEEE TRANSACTIONS ON SYSTEMS MAN CYBERNETICS-SYSTEMS, 2021, 51 (03): : 1779 - 1793
  • [44] Risk-averse stochastic programming approach for microgrid planning under uncertainty
    Narayan, Apurva
    Ponnambalam, Kumaraswamy
    RENEWABLE ENERGY, 2017, 101 : 399 - 408
  • [45] Risk-averse stochastic bilevel programming: An application to natural gas markets
    Jayadev, Gopika
    Leibowicz, Benjamin D.
    Bard, Jonathan F.
    Calci, Baturay
    COMPUTERS & INDUSTRIAL ENGINEERING, 2022, 169
  • [46] Evaluating policies in risk-averse multi-stage stochastic programming
    Kozmik, Vaclav
    Morton, David P.
    MATHEMATICAL PROGRAMMING, 2015, 152 (1-2) : 275 - 300
  • [48] Risk-Averse Stochastic Programming vs. Adaptive Robust Optimization: A Virtual Power Plant Application
    Lima, Ricardo M.
    Conejo, Antonio J.
    Giraldi, Loic
    Le Maitre, Olivier
    Hoteit, Ibrahim
    Knio, Omar M.
    INFORMS JOURNAL ON COMPUTING, 2022, 34 (03) : 1795 - 1818
  • [49] Risk-Averse Trajectory Optimization via Sample Average Approximation
    Lew, Thomas
    Bonalli, Riccardo
    Pavone, Marco
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2024, 9 (02) : 1500 - 1507
  • [50] Adaptive sampling strategies for risk-averse stochastic optimization with constraints
    Beiser, Florian
    Keith, Brendan
    Urbainczyk, Simon
    Wohlmuth, Barbara
    IMA JOURNAL OF NUMERICAL ANALYSIS, 2023, 43 (06) : 3729 - 3765