Life-cycle production optimization with nonlinear constraints using a least-squares support-vector regression proxy

Cited: 0
Authors
Almasov, Azad [1 ,2 ]
Nguyen, Quang M. [1 ]
Onur, Mustafa [1 ]
Affiliations
[1] Univ Tulsa, 800 S Tucker Dr, Tulsa, OK 74104 USA
[2] PetroTel Inc., 5700 Tennyson Pkwy, Suite 500, Plano, TX 75024 USA
Keywords
Machine learning; Least-squares support-vector regression; Waterflooding optimization; Nonlinear constraints; Numerical optimization; RELATIVE PERMEABILITY; GRADIENT; OUTPUT;
DOI
10.1016/j.geoen.2024.213142
Chinese Library Classification (CLC)
TE [Petroleum and natural gas industry]; TK [Energy and power engineering]
Subject classification codes
0807; 0820
Abstract
When nonlinear constraints, such as field liquid or water production rates or injection pressures as functions of time, must be honored in addition to linear ones, the life-cycle production optimization problem, a component of closed-loop reservoir management, becomes challenging and computationally expensive to solve with a high-fidelity reservoir simulator using existing gradient-based methods built on adjoint or stochastic approximate gradients. The objective of this study is therefore to present computationally efficient methods for deterministic production optimization under nonlinear constraints using a kernel-based machine learning method, with the net present value (NPV) as the cost function. We use least-squares support-vector regression (LSSVR) to approximate the NPV function. To achieve computational efficiency, we generate a set of output values of the NPV and of the nonlinear constraint functions, which in this study are the field liquid production rate (FLPR) and field water production rate (FWPR), by running the high-fidelity simulator for a broad set of input design variables (well controls), and we then use the collected input/output data to train LSSVR proxy models that replace the high-fidelity simulator when computing the NPV and the nonlinear state constraint functions during iterations of sequential quadratic programming (SQP). To obtain improved (higher) estimates of the optimal NPV, we use the existing iterative sampling refinement (ISR) method to update the LSSVR proxy so that it remains predictive toward promising regions of the search space during the optimization. Direct and indirect ways of constructing LSSVR-based NPVs, as well as different combinations of input data, including the nonlinear state constraints and/or the bottomhole pressures (BHPs) and water injection rates, are tested as alternative feature spaces. The results obtained with the proposed LSSVR-based optimization methods are compared, on the Brugge reservoir model, with those obtained from our in-house stochastic simplex approximate gradient (StoSAG)-based line-search SQP algorithm (LS-SQP-StoSAG), which uses the high-fidelity simulator directly to compute StoSAG gradients of the objective function and the nonlinear state functions. The results show that nonlinearly constrained optimization with LSSVR-ISR and SQP is 3.25 times more computationally efficient than LS-SQP-StoSAG. In addition, constructing the NPV indirectly from the field liquid and water rates for a waterflooding problem, with inputs taken from the LSSVR proxies of the nonlinear state constraints, requires significantly fewer training samples than constructing the NPV directly from NPVs computed by the high-fidelity simulator. LSSVR offers advantages in computational efficiency, the main goal of this research, and in robustness against overfitting, especially when data are limited. In contrast, deep neural networks (DNNs) and random forests require larger training sets, much more computational resources, and longer training times; random forests and gradient boosting machines can also be prone to overfitting and become computationally intensive.
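The proxy-plus-SQP workflow summarized above (fit LSSVR surrogates of the NPV, FLPR, and FWPR on simulator samples, then optimize the well controls on the surrogates under the nonlinear constraints) can be illustrated with a short sketch. The snippet below is a minimal illustration under stated assumptions, not the authors' implementation: the LSSVR fit solves the standard dual (KKT) linear system for an RBF kernel, the training data are random placeholders standing in for high-fidelity simulator runs, the time-dependent FLPR/FWPR constraints are reduced to single scalar limits, and no iterative sampling refinement is performed. All names (`LSSVR`, `npv_proxy`, `flpr_max`, etc.) and parameter values are introduced here purely for illustration.

```python
import numpy as np
from scipy.optimize import minimize


class LSSVR:
    """Least-squares support-vector regression with an RBF kernel.

    Training reduces to solving a single (n+1) x (n+1) linear system in the
    dual variables, which keeps refitting cheap inside a sampling-refinement loop.
    """

    def __init__(self, gamma=100.0, sigma=1.0):
        self.gamma = gamma  # regularization weight
        self.sigma = sigma  # RBF kernel width

    def _kernel(self, A, B):
        d2 = (np.sum(A**2, axis=1)[:, None]
              + np.sum(B**2, axis=1)[None, :]
              - 2.0 * A @ B.T)
        return np.exp(-d2 / (2.0 * self.sigma**2))

    def fit(self, X, y):
        self.X = np.asarray(X, dtype=float)
        n = self.X.shape[0]
        K = self._kernel(self.X, self.X)
        # KKT system:  [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
        A = np.zeros((n + 1, n + 1))
        A[0, 1:] = 1.0
        A[1:, 0] = 1.0
        A[1:, 1:] = K + np.eye(n) / self.gamma
        sol = np.linalg.solve(A, np.concatenate(([0.0], np.asarray(y, dtype=float))))
        self.b, self.alpha = sol[0], sol[1:]
        return self

    def predict(self, X):
        return self._kernel(np.asarray(X, dtype=float), self.X) @ self.alpha + self.b


# --- placeholder training data standing in for high-fidelity simulator runs ---
# X_train: (n_samples, n_controls) well-control vectors (e.g. producer BHPs and
# injector rates, scaled to [0, 1]); y_*: corresponding simulator outputs.
rng = np.random.default_rng(0)
X_train = rng.uniform(0.0, 1.0, size=(200, 30))
y_npv = rng.normal(size=200)    # NPV samples (placeholder values)
y_flpr = rng.normal(size=200)   # FLPR samples (placeholder values)
y_fwpr = rng.normal(size=200)   # FWPR samples (placeholder values)

npv_proxy = LSSVR(gamma=50.0, sigma=2.0).fit(X_train, y_npv)
flpr_proxy = LSSVR(gamma=50.0, sigma=2.0).fit(X_train, y_flpr)
fwpr_proxy = LSSVR(gamma=50.0, sigma=2.0).fit(X_train, y_fwpr)

# SQP step (SciPy's SLSQP) on the proxies: maximize proxy NPV subject to bound
# constraints on the controls and upper limits on the FLPR/FWPR proxies.
flpr_max, fwpr_max = 1.0, 0.5   # illustrative scalar limits
constraints = [
    {"type": "ineq", "fun": lambda u: flpr_max - flpr_proxy.predict(u[None, :])[0]},
    {"type": "ineq", "fun": lambda u: fwpr_max - fwpr_proxy.predict(u[None, :])[0]},
]
result = minimize(lambda u: -npv_proxy.predict(u[None, :])[0],
                  x0=X_train[0], method="SLSQP",
                  bounds=[(0.0, 1.0)] * X_train.shape[1],
                  constraints=constraints)
print("proxy-optimal NPV estimate:", -result.fun)
```

In the ISR variant described above, the controls visited during the SQP iterations would be re-evaluated with the high-fidelity simulator, appended to the training set, and the proxies refit before re-solving; this is what keeps the surrogates predictive near promising regions of the search space.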
Pages: 19
Related papers
50 in total
  • [31] Feed-Forward Controlling of Servo-Hydraulic Actuators Utilizing a Least-Squares Support-Vector Machine
    Sharghi, Amir Hossein
    Mohammadi, Reza Karami
    Farrokh, Mojtaba
    Zolfagharysaravi, Sina
    ACTUATORS, 2020, 9 (01)
  • [32] Robust Least-Squares Support Vector Machine Using Probabilistic Inference
    Lu, Xinjiang
    Bai, Yunxu
    IEEE TRANSACTIONS ON CYBERNETICS, 2022, 52 (06) : 4391 - 4399
  • [33] Nonlinear decoupling controller design based on least squares support vector regression
    Wen Xiangjun
    Zhang Yunong
    Yan Weiwu
    Xu Xiaoming
    Journal of Zhejiang University Science A (Science in Engineering), 2006, (02) : 275 - 284
  • [34] Nonlinear decoupling controller design based on least squares support vector regression
    Wen X.-J.
    Zhang Y.-N.
    Yan W.-W.
    Xu X.-M.
    J Zhejiang Univ: Sci, 2006, (2) : 275 - 284
  • [35] Nonlinear Calibration of Thermocouple Sensor Based on Least Squares Support Vector Regression
    Zhang, Shengbo
    Dai, Qingling
    PROCEEDINGS OF THE 2015 5TH INTERNATIONAL CONFERENCE ON COMPUTER SCIENCES AND AUTOMATION ENGINEERING, 2016, 42 : 5 - 10
  • [36] A Nonlinear Adaptive Beamforming Algorithm Based on Least Squares Support Vector Regression
    Wang, Lutao
    Jin, Gang
    Li, Zhengzhou
    Xu, Hongbin
    SENSORS, 2012, 12 (09) : 12424 - 12436
  • [37] Wireless IoT and Cyber-Physical System for Health Monitoring Using Honey Badger Optimized Least-Squares Support-Vector Machine
    Premalatha, G.
    Bai, V. Thulasi
    WIRELESS PERSONAL COMMUNICATIONS, 2022, 124 (04) : 3013 - 3034
  • [38] Efficient optimization of hyper-parameters for least squares support vector regression
    Fischer, Andreas
    Langensiepen, Gerd
    Luig, Klaus
    Strasdat, Nico
    Thies, Thorsten
    OPTIMIZATION METHODS & SOFTWARE, 2015, 30 (06) : 1095 - 1108
  • [39] Wireless IoT and Cyber-Physical System for Health Monitoring Using Honey Badger Optimized Least-Squares Support-Vector Machine
    G. Premalatha
    V. Thulasi Bai
    Wireless Personal Communications, 2022, 124 : 3013 - 3034
  • [40] Interval analysis using least squares support vector fuzzy regression
    Yongqi Chen
    Qijun Chen
    Chen, Y. (chenyongqi@nbu.edu.cn), South China University of Technology (10) : 458 - 464