Subspace enhanced active learning method for high-dimensional reliability analysis

Cited: 0
Authors
Li, Yifan [1 ]
Xiang, Yongyong [1 ]
Pan, Baisong [1 ]
Shi, Luojie [2 ]
Chen, Qingli [1 ]
Weng, Weini [1 ]
Affiliations
[1] Zhejiang Univ Technol, Coll Mech Engn, Hangzhou 310023, Zhejiang, Peoples R China
[2] Univ Elect Sci & Technol, Sch Mech & Elect Engn, Chengdu 611731, Sichuan, Peoples R China
Keywords
Reliability analysis; Surrogate model; Dimensionality reduction; Active learning; Subspace enhancement; SLICED INVERSE REGRESSION; RESPONSE-SURFACE METHOD; STRUCTURAL RELIABILITY; REDUCTION METHOD; GAUSSIAN-PROCESSES; UNCERTAINTY; ALGORITHM; MODELS;
DOI
10.1007/s00158-025-03967-3
Chinese Library Classification
TP39 [Computer Applications]
Discipline Code
081203; 0835
Abstract
Subspace-based surrogate modeling methods are widely used for high-dimensional reliability analysis because they mitigate the "curse of dimensionality". The accuracy of the subspace fundamentally determines the credibility of the reliability analysis. However, existing methods may construct a subspace of poor accuracy because they perform the dimensionality reduction using only the responses and gradients of a limited number of samples, which makes the estimated failure probability inaccurate. To address this issue, this paper proposes a novel high-dimensional reliability analysis method that combines subspace enhancement with active learning. First, a projection matrix is computed using sliced inverse regression to obtain an initial subspace. To calibrate the projection directions, a nested optimization strategy is developed that refines the parameters of the projection matrix using a manifold optimization technique. In the early stages of active learning, a similarity-preserving global sampling strategy is proposed to select the samples that most improve the accuracy of the dimensionality reduction. Once the dimensionality reduction is sufficiently accurate, a local sampling strategy is employed to identify samples near the limit-state boundary, improving the accuracy of the failure probability estimate. Finally, the learning process is terminated by a stopping criterion based on an upper-bound estimate of the relative error. The performance of the proposed method is validated on two numerical examples and two engineering cases.
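As a rough illustration of the first step described in the abstract, the sketch below estimates an initial projection matrix with sliced inverse regression (SIR) from input samples and their limit-state responses. This is a minimal, generic SIR implementation in Python and not the authors' code: the function name sliced_inverse_regression, the toy 40-dimensional limit state, and all parameter values are illustrative assumptions, and the subsequent refinement by manifold optimization and the active-learning sampling strategies are not shown.

import numpy as np

def sliced_inverse_regression(X, y, n_slices=10, n_components=2):
    # Estimate a dimension-reduction matrix B (d x n_components) via SIR,
    # so that the reduced features X @ B span the estimated subspace.
    n, d = X.shape
    # Standardize the inputs: Z = (X - mean) Sigma^{-1/2}
    mu = X.mean(axis=0)
    Sigma = np.cov(X, rowvar=False) + 1e-10 * np.eye(d)  # small jitter for numerical stability
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = (X - mu) @ Sigma_inv_sqrt
    # Sort the samples by response and split them into slices
    order = np.argsort(y)
    M = np.zeros((d, d))
    for idx in np.array_split(order, n_slices):
        m_h = Z[idx].mean(axis=0)                 # slice mean in the standardized space
        M += (len(idx) / n) * np.outer(m_h, m_h)  # proportion-weighted outer product
    # Leading eigenvectors of M give the SIR directions; back-transform to the original space
    w, V = np.linalg.eigh(M)
    top = np.argsort(w)[::-1][:n_components]
    return Sigma_inv_sqrt @ V[:, top]

# Toy usage (hypothetical 40-dimensional limit state that varies along a single direction):
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 40))
g = X.sum(axis=1) / np.sqrt(40) - 2.5   # responses of the limit-state function
B = sliced_inverse_regression(X, g, n_slices=10, n_components=1)
X_low = X @ B                            # reduced coordinates for fitting a surrogate (e.g., a GP)

In the proposed method such a matrix would serve only as the starting point: its parameters are then refined on the manifold of projection matrices, and the surrogate is updated through the global and local sampling strategies until the stopping criterion on the relative-error bound is satisfied.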
Page count: 26
Related Papers
(50 records in total)
  • [21] Improved Estimation of High-dimensional Additive Models Using Subspace Learning
    He, Shiyuan
    He, Kejun
    Huang, Jianhua Z.
    JOURNAL OF COMPUTATIONAL AND GRAPHICAL STATISTICS, 2022, 31 (03) : 866 - 876
  • [22] Communication-efficient Subspace Methods for High-dimensional Federated Learning
    Shi, Zai
    Eryilmaz, Atilla
    2021 17TH INTERNATIONAL CONFERENCE ON MOBILITY, SENSING AND NETWORKING (MSN 2021), 2021, : 543 - 550
  • [23] Subspace Estimation From Incomplete Observations: A High-Dimensional Analysis
    Wang, Chuang
    Eldar, Yonina C.
    Lu, Yue M.
    IEEE JOURNAL OF SELECTED TOPICS IN SIGNAL PROCESSING, 2018, 12 (06) : 1240 - 1252
  • [24] High-dimensional reliability analysis based on the improved number-theoretical method
    Gao, Kai
    Liu, Gang
    Tang, Wei
    APPLIED MATHEMATICAL MODELLING, 2022, 107 : 151 - 164
  • [25] A Compressed PCA Subspace Method for Anomaly Detection in High-Dimensional Data
    Ding, Qi
    Kolaczyk, Eric D.
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2013, 59 (11) : 7419 - 7433
  • [26] A feature group weighting method for subspace clustering of high-dimensional data
    Chen, Xiaojun
    Ye, Yunming
    Xu, Xiaofei
    Huang, Joshua Zhexue
    PATTERN RECOGNITION, 2012, 45 (01) : 434 - 446
  • [27] Random Subspace Method for high-dimensional regression with the R package regRSM
    Teisseyre, Pawel
    Klopotek, Robert A.
    Mielniczuk, Jan
    COMPUTATIONAL STATISTICS, 2016, 31 (03) : 943 - 972
  • [28] Random Subspace Method for high-dimensional regression with the R package regRSM
    Paweł Teisseyre
    Robert A. Kłopotek
    Jan Mielniczuk
    Computational Statistics, 2016, 31 : 943 - 972
  • [29] Constraining the parameters of high-dimensional models with active learning
    Sascha Caron
    Tom Heskes
    Sydney Otten
    Bob Stienen
    The European Physical Journal C, 2019, 79
  • [30] Constraining the parameters of high-dimensional models with active learning
    Caron, Sascha
    Heskes, Tom
    Otten, Sydney
    Stienen, Bob
    EUROPEAN PHYSICAL JOURNAL C, 2019, 79 (11)