Life-cycle production optimization with nonlinear constraints using a least-squares support-vector regression proxy

Cited by: 0
Authors
Almasov, Azad [1 ,2 ]
Nguyen, Quang M. [1 ]
Onur, Mustafa [1 ]
Affiliations
[1] Univ Tulsa, 800 S Tucker Dr, Tulsa, OK 74104 USA
[2] PetroTel INC, 5700 Tennyson Pkwy Suite 500, Plano, TX 75024 USA
Source
GEOENERGY SCIENCE AND ENGINEERING
Keywords
Machine learning; Least-squares support-vector regression; Waterflooding optimization; Nonlinear constraints; Numerical optimization; RELATIVE PERMEABILITY; GRADIENT; OUTPUT;
DOI
10.1016/j.geoen.2024.213142
Chinese Library Classification (CLC)
TE [Petroleum and natural gas industry]; TK [Energy and power engineering]
Discipline codes
0807; 0820
Abstract
When nonlinear constraints such as field liquid or water production rates and injection pressures must be honored as functions of time in addition to linear ones, the life-cycle production optimization problem, a component of closed-loop reservoir management, becomes challenging and computationally expensive to solve with a high-fidelity reservoir simulator using existing gradient-based methods built on adjoint or stochastic approximate gradients. The objective of this study is therefore to present computationally efficient methods for deterministic production optimization under nonlinear constraints using a kernel-based machine learning method, where the cost function is the net present value (NPV). We use least-squares support-vector regression (LSSVR) to approximate the NPV function. To achieve computational efficiency, we generate output values of the NPV and of the nonlinear constraint functions, which in this study are the field liquid production rate (FLPR) and field water production rate (FWPR), by running the high-fidelity simulator for a broad set of input design variables (well controls), and we then use the collected input/output data to train LSSVR proxy models that replace the high-fidelity simulator when computing the NPV and the nonlinear state constraint functions during iterations of sequential quadratic programming (SQP). To obtain improved (higher) estimated optimal NPV values, we use the existing iterative sampling refinement (ISR) method to update the LSSVR proxy so that it remains predictive toward promising regions of the search space during optimization. Direct and indirect ways of constructing LSSVR-based NPVs, as well as different combinations of input data, including the nonlinear state constraints and/or the bottomhole pressures (BHPs) and water injection rates, are tested as alternative feature spaces. The results of the proposed LSSVR-based optimization methods are compared, on the Brugge reservoir model, with those of our in-house stochastic simplex approximate gradient (StoSAG)-based line-search SQP (LS-SQP-StoSAG) algorithm, which uses the high-fidelity simulator directly to compute StoSAG gradients of the objective function and the nonlinear state functions. The results show that nonlinearly constrained optimization with LSSVR-ISR and SQP is 3.25-fold more computationally efficient than LS-SQP-StoSAG. In addition, for the waterflooding problem considered, constructing the NPV indirectly from the field liquid and water rates, with inputs supplied by the LSSVR proxies of the nonlinear state constraints, requires significantly fewer training samples than constructing the NPV proxy directly from NPVs computed with the high-fidelity simulator. LSSVR offers advantages in computational efficiency, the main goal of this research, and in robustness against overfitting, especially when training data are limited. In contrast, deep neural networks and random forests require larger training sets, considerably more computational resources, and longer training times, and random forests and gradient boosting machines can be prone to overfitting and become computationally intensive.
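To make the workflow in the abstract concrete, the following minimal Python sketch shows the two ingredients it describes: an LSSVR proxy fitted to simulator outputs and a proxy-based SQP solve of the nonlinearly constrained problem. Everything below is an illustrative assumption rather than the paper's actual setup: the hyperparameters (gamma, sigma), the constraint limits, and the randomly generated placeholder training data are arbitrary, and SciPy's SLSQP solver stands in for the authors' SQP implementation; the iterative sampling refinement loop is indicated only as a comment.

import numpy as np
from scipy.optimize import minimize

def rbf_kernel(A, B, sigma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of A and B.
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-sq / (2.0 * sigma**2))

class LSSVR:
    # Least-squares SVR in dual form: fitting reduces to one linear solve of the
    # KKT system, and the prediction is f(x) = sum_i alpha_i k(x, x_i) + b.
    def __init__(self, gamma=100.0, sigma=1.0):
        self.gamma, self.sigma = gamma, sigma
    def fit(self, X, y):
        n = X.shape[0]
        K = rbf_kernel(X, X, self.sigma)
        A = np.zeros((n + 1, n + 1))
        A[0, 1:] = 1.0
        A[1:, 0] = 1.0
        A[1:, 1:] = K + np.eye(n) / self.gamma
        sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
        self.b, self.alpha, self.X = sol[0], sol[1:], X
        return self
    def predict(self, Xnew):
        return rbf_kernel(np.atleast_2d(Xnew), self.X, self.sigma) @ self.alpha + self.b

# Hypothetical training set: each row of U is one well-control schedule (scaled
# BHPs and injection rates); npv, flpr, fwpr would come from high-fidelity
# simulation runs. Random numbers are used here only as placeholders.
rng = np.random.default_rng(0)
n_runs, n_controls = 200, 20
U = rng.uniform(0.0, 1.0, size=(n_runs, n_controls))
npv  = rng.normal(size=n_runs)
flpr = rng.uniform(size=n_runs)
fwpr = rng.uniform(size=n_runs)

npv_proxy  = LSSVR().fit(U, npv)    # direct NPV proxy
flpr_proxy = LSSVR().fit(U, flpr)   # proxies of the nonlinear state constraints
fwpr_proxy = LSSVR().fit(U, fwpr)

# SQP step on the proxies (SciPy's SLSQP as a stand-in); inequality constraints
# are written as g(u) >= 0 with illustrative limits 0.8 and 0.6.
cons = [
    {"type": "ineq", "fun": lambda u: 0.8 - flpr_proxy.predict(u)[0]},
    {"type": "ineq", "fun": lambda u: 0.6 - fwpr_proxy.predict(u)[0]},
]
res = minimize(lambda u: -npv_proxy.predict(u)[0],   # maximize NPV
               x0=U[np.argmax(npv)], method="SLSQP",
               bounds=[(0.0, 1.0)] * n_controls, constraints=cons)
# Iterative sampling refinement (ISR) would re-simulate near res.x, add those
# runs to the training set, refit the proxies, and repeat the SQP solve.

The indirect NPV construction discussed in the abstract would, instead of fitting a separate NPV proxy, compute the NPV from the FLPR and FWPR proxy outputs, which is consistent with its requiring fewer training simulations in the paper.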
Pages: 19
Related papers
50 records in total
  • [22] Life-Cycle Gradient-Based Production Optimization Including Well-Shutoff Option with Least-Squares Support Vector Regression
    Almasov, Azad
    Toktas, Omer Lutfu
    Onur, Mustafa
    SPE JOURNAL, 2024, 29 (10) : 5132 - 5150
  • [23] Feasible generalized least squares using support vector regression
    Miller, Steve
    Startz, Richard
    ECONOMICS LETTERS, 2019, 175 : 28 - 31
  • [25] An improved support vector regression using least squares method
    Yan, Cheng
    Shen, Xiuli
    Guo, Fushui
    STRUCTURAL AND MULTIDISCIPLINARY OPTIMIZATION, 2018, 57 (06) : 2431 - 2445
  • [26] Nonlinear temperature compensation of fluxgate magnetometers with a least-squares support vector machine
    Pang, Hongfeng
    Chen, Dixiang
    Pan, Mengchun
    Luo, Shitu
    Zhang, Qi
    Luo, Feilu
    MEASUREMENT SCIENCE AND TECHNOLOGY, 2012, 23 (02)
  • [27] A parametric model for cavity filter using multiple outputs least-squares fuzzy support vector regression
    Wu, Shengbiao
    Luo, Xianxi
    INTERNATIONAL JOURNAL OF RF AND MICROWAVE COMPUTER-AIDED ENGINEERING, 2022, 32 (10)
  • [28] Noise reduction and drift removal using least-squares support vector regression with the implicit bias term
    Deng, Xiaoying
    Yang, Dinghui
    Peng, Jiming
    Guan, Xin
    Yang, Baojun
    GEOPHYSICS, 2010, 75 (06) : V119 - V127
  • [29] Application of Least-Squares Support-Vector Machine Based on Hysteresis Operators and Particle Swarm Optimization for Modeling and Control of Hysteresis in Piezoelectric Actuators
    Baziyad, Ayad G.
    Nouh, Adnan S.
    Ahmad, Irfan
    Alkuhayli, Abdulaziz
    ACTUATORS, 2022, 11 (08)
  • [30] Lidar signal denoising using least-squares support vector machine
    Sun, BY
    Huang, DS
    Fang, HT
    IEEE SIGNAL PROCESSING LETTERS, 2005, 12 (02) : 101 - 104