High Dimensional Model Representation With Principal Component Analysis

Cited by: 27
Authors
Hajikolaei, Kambiz Haji [1 ]
Wang, G. Gary [1 ]
Affiliations
[1] Simon Fraser Univ, Sch Mechatron Syst Engn, PDOL, Surrey, BC V3T 0A3, Canada
Keywords
high dimension; large scale; metamodeling; HDMR; principal component analysis; sampling; RS-HDMR; ALGORITHM;
DOI
10.1115/1.4025491
Chinese Library Classification
TH [Machinery and Instrument Industry];
Subject Classification Code
0802 ;
Abstract
In engineering design, spending an excessive amount of time on physical experiments or expensive simulations makes the design process costly and lengthy. This issue is exacerbated when the design problem has a large number of inputs, i.e., is of high dimension. High dimensional model representation (HDMR) is a powerful method for approximating high dimensional, expensive, black-box (HEB) problems. One existing HDMR implementation, random sampling HDMR (RS-HDMR), builds an HDMR model from random sample points using a linear combination of basis functions. The most critical issue in RS-HDMR is that calculating the coefficients of the basis functions involves integrals that are approximated by Monte Carlo summations, which are error prone with limited samples, especially with nonuniform sampling. In this paper, a new approach based on principal component analysis (PCA), called PCA-HDMR, is proposed for finding the coefficients that provide the best linear combination of the bases with minimum error, without evaluating any integral. Several benchmark problems of different dimensionalities and one engineering problem are modeled using the method, and the results are compared with RS-HDMR results. In all problems, with both uniform and nonuniform sampling, PCA-HDMR built more accurate models than RS-HDMR for a given set of sample points.
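The Monte Carlo coefficient estimation that the abstract critiques can be illustrated with a minimal first-order HDMR sketch. This is a hedged illustration, not the paper's method: all names (`legendre_basis`, `rs_hdmr_first_order`) are invented for this example, it uses a simple additive test function with shifted Legendre bases on [0, 1], and it implements only the Monte Carlo summation step of RS-HDMR, not the proposed PCA-HDMR coefficient-fitting procedure.

```python
# Sketch of first-order RS-HDMR coefficient estimation via Monte Carlo
# summation (the step the paper identifies as error prone). Illustrative
# only; the paper's PCA-based alternative is not reproduced here.
import numpy as np

def legendre_basis(x, k):
    # Shifted Legendre polynomials on [0, 1], orthonormal under the
    # uniform weight, zero mean for k >= 1.
    t = 2.0 * x - 1.0
    if k == 1:
        return np.sqrt(3.0) * t
    if k == 2:
        return np.sqrt(5.0) * 0.5 * (3.0 * t**2 - 1.0)
    raise ValueError("only k = 1, 2 implemented in this sketch")

def rs_hdmr_first_order(f, X, n_basis=2):
    # X: (n, d) uniform random samples on [0, 1]^d.
    Y = f(X)
    f0 = Y.mean()                       # zeroth-order term, E[f]
    n, d = X.shape
    alpha = np.zeros((d, n_basis))
    for i in range(d):
        for k in range(1, n_basis + 1):
            # Monte Carlo estimate of the integral  ∫ f(x) φ_k(x_i) dx;
            # its accuracy degrades with few or nonuniform samples.
            alpha[i, k - 1] = np.mean(Y * legendre_basis(X[:, i], k))
    return f0, alpha

def predict(f0, alpha, X):
    Y = np.full(X.shape[0], f0)
    for i in range(alpha.shape[0]):
        for k in range(1, alpha.shape[1] + 1):
            Y += alpha[i, k - 1] * legendre_basis(X[:, i], k)
    return Y

rng = np.random.default_rng(0)
f = lambda X: X.sum(axis=1) + (X**2).sum(axis=1)  # additive test function
X = rng.random((50000, 3))
f0, alpha = rs_hdmr_first_order(f, X)
err = np.abs(predict(f0, alpha, X) - f(X)).mean()
```

For this additive degree-2 function the first-order expansion is exact in the limit, so any residual `err` is pure Monte Carlo estimation error; shrinking the sample set inflates it, which is precisely the weakness the paper's integral-free PCA-HDMR coefficient fit is designed to avoid.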
Pages: 11