Objective-sensitive principal component analysis for high-dimensional inverse problems

Cited by: 2
Authors
Elizarev, Maksim [1 ]
Mukhin, Andrei [1 ]
Khlyupin, Aleksey [1 ]
Affiliations
[1] Moscow Inst Phys & Technol, Ctr Engn & Technol, 9 Institutskiy Per, Dolgoprudnyi 141701, Russia
Keywords
Principal component analysis; Dimensionality reduction; Inverse problems; Optimization; History matching; Reservoir simulation; ENSEMBLE KALMAN FILTER; DIFFERENTIABLE PARAMETERIZATION; EFFICIENT; MODEL; REPRESENTATION; MEDIA;
DOI
10.1007/s10596-021-10081-y
Chinese Library Classification (CLC)
TP39 [Computer applications]
Discipline classification codes
081203; 0835
Abstract
We introduce a novel approach to data-driven dimensionality reduction for solving high-dimensional optimization problems, including history matching. Objective-Sensitive parameterization of the argument accounts for the corresponding change in the objective function value. This is achieved by extending the conventional loss function, which quantifies only the approximation error over realizations. The paper presents three instances of this approach based on Principal Component Analysis (PCA). Gradient-Sensitive PCA (GS-PCA) exploits a linear approximation of the objective function. Two other approaches solve the problem approximately within the framework of stationary perturbation theory (SPT). All the algorithms are verified and tested on a synthetic reservoir model. The results demonstrate improvements in parameterization quality in terms of revealing the unconstrained minimum of the objective function. We also discuss possible extensions and analyze the overall applicability of the Objective-Sensitive approach, which can be combined with modern parameterization techniques beyond PCA.
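To make the extended loss concrete, below is a minimal Python sketch of a Gradient-Sensitive PCA-style loss under stated assumptions: it is not the authors' implementation, and the array shapes, the weighting factor alpha, and the function names are illustrative. The standard PCA reconstruction error over realizations is augmented with a first-order (gradient-based) estimate of how much the objective value changes when each realization is replaced by its low-dimensional reconstruction.

import numpy as np

# Hedged sketch: X is an (n_realizations, n_cells) matrix of model realizations,
# grads holds the objective gradient evaluated at each realization (same shape),
# and Phi is an orthonormal basis of shape (n_cells, k). All names are illustrative.

def reconstruct(X, X_mean, Phi):
    """Project centered realizations onto the basis and reconstruct them."""
    Z = (X - X_mean) @ Phi              # latent coordinates, shape (n, k)
    return X_mean + Z @ Phi.T           # reconstructions, shape (n, n_cells)

def objective_sensitive_loss(X, grads, Phi, alpha=1.0):
    """Conventional PCA approximation error plus an objective-sensitive term.

    The second term linearizes the objective around each realization m and
    penalizes grad_J(m)^T (m - m_hat), i.e. the first-order change of the
    objective caused by the projection error.
    """
    X_mean = X.mean(axis=0, keepdims=True)
    X_hat = reconstruct(X, X_mean, Phi)
    residual = X - X_hat
    approx_err = np.sum(residual ** 2)                       # standard PCA loss
    obj_shift = np.sum((grads * residual).sum(axis=1) ** 2)  # linearized change of J
    return approx_err + alpha * obj_shift

In the paper the basis itself is adjusted to reduce such an extended loss; this sketch only evaluates it for a given basis Phi and a given trade-off weight alpha.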
Pages: 2019-2031
Number of pages: 13
Related papers
50 records in total
  • [21] High-dimensional inference with the generalized Hopfield model: Principal component analysis and corrections
    Cocco, S.
    Monasson, R.
    Sessak, V.
    PHYSICAL REVIEW E, 2011, 83 (05)
  • [22] High-dimensional covariance forecasting based on principal component analysis of high-frequency data
    Jian, Zhihong
    Deng, Pingjun
    Zhu, Zhican
    ECONOMIC MODELLING, 2018, 75 : 422 - 431
  • [23] Functional principal component model for high-dimensional brain imaging
    Zipunnikov, Vadim
    Caffo, Brian
    Yousem, David M.
    Davatzikos, Christos
    Schwartz, Brian S.
    Crainiceanu, Ciprian
    NEUROIMAGE, 2011, 58 (03) : 772 - 784
  • [24] CONVERGENCE AND PREDICTION OF PRINCIPAL COMPONENT SCORES IN HIGH-DIMENSIONAL SETTINGS
    Lee, Seunggeun
    Zou, Fei
    Wright, Fred A.
    ANNALS OF STATISTICS, 2010, 38 (06): 3605 - 3629
  • [25] The effect of principal component analysis on machine learning accuracy with high-dimensional spectral data
    Howley, Tom
    Madden, Michael G.
    O'Connell, Marie-Louise
    Ryder, Alan G.
    KNOWLEDGE-BASED SYSTEMS, 2006, 19 (05) : 363 - 370
  • [26] Using principal component analysis for neural network high-dimensional potential energy surface
    Casier, Bastien
    Carniato, Stephane
    Miteva, Tsveta
    Capron, Nathalie
    Sisourat, Nicolas
    JOURNAL OF CHEMICAL PHYSICS, 2020, 152 (23)
  • [27] Evaluating the performance of sparse principal component analysis methods in high-dimensional data scenarios
    Bonner, Ashley J.
    Beyene, Joseph
    COMMUNICATIONS IN STATISTICS-SIMULATION AND COMPUTATION, 2017, 46 (05) : 3794 - 3811
  • [28] Principal Component Geostatistical Approach for large-dimensional inverse problems
    Kitanidis, P. K.
    Lee, J.
    WATER RESOURCES RESEARCH, 2014, 50 (07) : 5428 - 5443
  • [29] Plug-in Estimation in High-Dimensional Linear Inverse Problems: A Rigorous Analysis
    Fletcher, Alyson K.
    Pandit, Parthe
    Rangan, Sundeep
    Sarkar, Subrata
    Schniter, Philip
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [30] New high-dimensional indexing structure based on principal component sorting
    School of Computer Science and Engineering, Xidian Univ., Xi'an 710071, China
    Xi Tong Cheng Yu Dian Zi Ji Shu/Syst Eng Electron, 2006, (12): 1927 - 1931