Objective-sensitive principal component analysis for high-dimensional inverse problems

Cited by: 2
Authors
Elizarev, Maksim [1]
Mukhin, Andrei [1]
Khlyupin, Aleksey [1]
Affiliations
[1] Moscow Inst Phys & Technol, Ctr Engn & Technol, 9 Institutskiy Per, Dolgoprudnyi 141701, Russia
Keywords
Principal component analysis; Dimensionality reduction; Inverse problems; Optimization; History matching; Reservoir simulation; Ensemble Kalman filter; Differentiable parameterization; Efficient; Model; Representation; Media
DOI: 10.1007/s10596-021-10081-y
Chinese Library Classification: TP39 [Computer applications]
Discipline codes: 081203; 0835
Abstract
We introduce a novel approach to data-driven dimensionality reduction for solving high-dimensional optimization problems, including history matching. Objective-Sensitive parameterization of the argument accounts for the corresponding change in the objective function value. This is achieved by extending the conventional loss function, which quantifies only the approximation error over realizations. The paper presents three instances of this approach based on Principal Component Analysis (PCA). Gradient-Sensitive PCA (GS-PCA) exploits a linear approximation of the objective function, while two other approaches solve the problem approximately within the framework of stationary perturbation theory (SPT). All algorithms are verified and tested on a synthetic reservoir model. The results demonstrate improvements in parameterization quality in terms of revealing the unconstrained minimum of the objective function. We also outline possible extensions and analyze the overall applicability of the Objective-Sensitive approach, which can be combined with modern parameterization techniques beyond PCA.
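The abstract describes the method only at a conceptual level; the minimal NumPy sketch below illustrates one way an objective-sensitive basis of this kind could be assembled. It is not the authors' implementation: the function names, the `weight` parameter, and the rank-one augmentation of the sample covariance by the objective gradient are assumptions made for illustration, consistent with the stated idea of extending the reconstruction loss over realizations with a term sensitive to the linearized objective.

```python
import numpy as np

def pca_basis(realizations, n_components):
    """Conventional PCA parameterization: m ~ mean + Phi @ xi,
    where Phi holds the leading principal directions of the ensemble."""
    mean = realizations.mean(axis=1, keepdims=True)
    centered = realizations - mean
    U, _, _ = np.linalg.svd(centered, full_matrices=False)
    return mean, U[:, :n_components]

def objective_sensitive_basis(realizations, objective_grad, n_components, weight=1.0):
    """Illustrative objective-sensitive variant (hypothetical, not the paper's
    exact formulation): the sample covariance is augmented with a rank-one term
    built from the objective gradient, so that directions along which the
    linearized objective changes are preserved by the truncated basis."""
    n_real = realizations.shape[1]
    mean = realizations.mean(axis=1, keepdims=True)
    centered = realizations - mean
    g = objective_grad / np.linalg.norm(objective_grad)
    # Minimizing reconstruction error over realizations plus a weighted penalty
    # on the projected-out part of g is equivalent to this eigenproblem.
    C = centered @ centered.T / n_real + weight * np.outer(g, g)
    eigvals, eigvecs = np.linalg.eigh(C)
    order = np.argsort(eigvals)[::-1]
    return mean, eigvecs[:, order[:n_components]]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 40))   # 40 realizations of a 500-dim model
    grad = rng.normal(size=500)      # gradient of the objective at the mean model
    mean, Phi = objective_sensitive_basis(X, grad, n_components=10, weight=5.0)
    # Parameterize the model as m = mean[:, 0] + Phi @ xi and optimize over xi.
    xi = np.zeros(10)
    m = mean[:, 0] + Phi @ xi
```

In this reading, `weight` trades off reconstruction accuracy over the ensemble against preservation of the objective-gradient direction; with `weight=0` the construction reduces to conventional PCA.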
Pages: 2019-2031
Page count: 13
Related papers
50 records in total
  • [1] Objective-sensitive principal component analysis for high-dimensional inverse problems
    Maksim Elizarev
    Andrei Mukhin
    Aleksey Khlyupin
    Computational Geosciences, 2021, 25 : 2019 - 2031
  • [2] On principal component analysis for high-dimensional XCSR
    Behdad, Mohammad
    French, Tim
    Barone, Luigi
    Bennamoun, Mohammed
    EVOLUTIONARY INTELLIGENCE, 2012, 5 (02) : 129 - 138
  • [3] Principal component analysis for sparse high-dimensional data
    Raiko, Tapani
    Ilin, Alexander
    Karhunen, Juha
    NEURAL INFORMATION PROCESSING, PART I, 2008, 4984 : 566 - 575
  • [4] High-dimensional principal component analysis with heterogeneous missingness
    Zhu, Ziwei
    Wang, Tengyao
    Samworth, Richard J.
    JOURNAL OF THE ROYAL STATISTICAL SOCIETY SERIES B-STATISTICAL METHODOLOGY, 2022, 84 (05) : 2000 - 2031
  • [5] PRINCIPAL COMPONENT ANALYSIS IN VERY HIGH-DIMENSIONAL SPACES
    Lee, Young Kyung
    Lee, Eun Ryung
    Park, Byeong U.
    STATISTICA SINICA, 2012, 22 (03) : 933 - 956
  • [6] Test for high-dimensional outliers with principal component analysis
    Nakayama, Yugo
    Yata, Kazuyoshi
    Aoshima, Makoto
    JAPANESE JOURNAL OF STATISTICS AND DATA SCIENCE, 2024, : 739 - 766
  • [7] Forecasting High-Dimensional Covariance Matrices Using High-Dimensional Principal Component Analysis
    Shigemoto, Hideto
    Morimoto, Takayuki
    AXIOMS, 2022, 11 (12)
  • [8] High-dimensional robust principal component analysis and its applications
    Jiang, Xiaobo
    Gao, Jie
    Yang, Zhongming
    JOURNAL OF COMPUTATIONAL METHODS IN SCIENCES AND ENGINEERING, 2023, 23 (05) : 2303 - 2311
  • [9] Multilevel Functional Principal Component Analysis for High-Dimensional Data
    Zipunnikov, Vadim
    Caffo, Brian
    Yousem, David M.
    Davatzikos, Christos
    Schwartz, Brian S.
    Crainiceanu, Ciprian
    JOURNAL OF COMPUTATIONAL AND GRAPHICAL STATISTICS, 2011, 20 (04) : 852 - 873
  • [10] Sparse principal component based high-dimensional mediation analysis
    Zhao, Yi
    Lindquist, Martin A.
    Caffo, Brian S.
    COMPUTATIONAL STATISTICS & DATA ANALYSIS, 2020, 142