Identification of sparse nonlinear controlled variables for near-optimal operation of chemical processes

Cited by: 0
Authors
Ma, Xie [1]
Guan, Hongwei [2]
Ye, Lingjian [3]
Affiliations
[1] Ningbo Univ Finance & Econ, Ningbo, Peoples R China
[2] Zhejiang Business Technol Inst, Ningbo, Peoples R China
[3] Huzhou Univ, Sch Engn, Huzhou Key Lab Intelligent Sensing & Optimal Contr, Huzhou 313000, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
chemical process; feedback control; neural networks; optimization; regularization; SELF-OPTIMIZING CONTROL; BATCH;
DOI
10.1002/cjce.25514
CLC Classification
TQ [Chemical Industry];
Discipline Code
0817;
Abstract
For the optimal operation of chemical processes, the selection of controlled variables plays an important role. A previous proposal is to approximate the necessary conditions of optimality (NCO) as the controlled variables, such that process optimality is automatically maintained by tracking constant zero setpoints. In this paper, we extend the NCO approximation method by identifying sparse nonlinear controlled variables, motivated by the fact that simplicity is always favoured in practical implementations. To this end, $l_1$-regularization is employed in approximating the NCO, so that the controlled variables remain simple even when they are specified as nonlinear functions. The sparse controlled variables are obtained using the proximal gradient method, implemented within a tailored Adam algorithm. Two case studies are provided to illustrate the proposed approach.
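The abstract names the two numerical ingredients, an $l_1$ penalty handled through its proximal operator (soft-thresholding) and an Adam-based update for the smooth fitting loss, but not the authors' actual implementation. Below is a minimal Python sketch of such a proximal-Adam loop under stated assumptions: proximal_adam, fit_grad, lmbda, and the synthetic fitting problem are all illustrative names, not taken from the paper.

```python
import numpy as np

def soft_threshold(x, tau):
    """Proximal operator of tau * ||x||_1 (elementwise soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def proximal_adam(theta, fit_grad, lmbda, lr=1e-3, beta1=0.9, beta2=0.999,
                  eps=1e-8, n_steps=5000):
    """Minimize smooth_fit(theta) + lmbda * ||theta||_1.

    fit_grad(theta) returns the gradient of the smooth data-fit term only;
    the nonsmooth l1 part is handled by the proximal step.
    """
    m = np.zeros_like(theta)  # first-moment estimate
    v = np.zeros_like(theta)  # second-moment estimate
    for t in range(1, n_steps + 1):
        g = fit_grad(theta)
        m = beta1 * m + (1.0 - beta1) * g
        v = beta2 * v + (1.0 - beta2) * g**2
        m_hat = m / (1.0 - beta1**t)          # bias-corrected moments
        v_hat = v / (1.0 - beta2**t)
        denom = np.sqrt(v_hat) + eps
        theta = theta - lr * m_hat / denom    # Adam step on the smooth loss
        # Prox of the l1 penalty in Adam's diagonal metric: the threshold
        # is rescaled per coordinate by the same effective step size.
        theta = soft_threshold(theta, lr * lmbda / denom)
    return theta

# Toy usage: fit a sparse linear controlled variable c(y) = w^T y to
# synthetic samples of an NCO that depends on only two measurements.
rng = np.random.default_rng(0)
Y = rng.standard_normal((200, 10))            # measurement samples
nco = Y[:, 0] - 2.0 * Y[:, 3]                 # synthetic NCO values

def fit_grad(w):                              # gradient of 0.5*||Y w - nco||^2 / n
    return Y.T @ (Y @ w - nco) / len(Y)

w = proximal_adam(np.zeros(10), fit_grad, lmbda=0.05, lr=0.05)
print(np.round(w, 3))                         # most entries driven exactly to zero
```

A nonlinear controlled variable would replace the linear map w^T y with, for example, a small neural network, with the same penalty applied to its weights so that inactive inputs and connections are pruned.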
Pages: 16