On sparse regression, Lp-regularization, and automated model discovery

Cited: 11
Authors
McCulloch, Jeremy A. [1 ]
St Pierre, Skyler R. [1 ]
Linka, Kevin [2 ]
Kuhl, Ellen [1 ,3 ]
Affiliations
[1] Stanford Univ, Dept Mech Engn & Bioengn, Stanford, CA USA
[2] Hamburg Univ Technol, Inst Continuum & Mat Mech, Hamburg, Germany
[3] Stanford Univ, Dept Mech Engn, 452 Escondido Mall, Stanford, CA 94305 USA
Funding
U.S. National Science Foundation;
Keywords
automated model discovery; constitutive modeling; hyperelasticity; Lp regularization; sparse regression; DEFORMATION; ELASTICITY; SELECTION;
DOI
10.1002/nme.7481
CLC number
T [Industrial Technology];
Discipline code
08;
Abstract
Sparse regression and feature extraction are the cornerstones of knowledge discovery from massive data. Their goal is to discover interpretable and predictive models that provide simple relationships among scientific variables. While the statistical tools for model discovery are well established in the context of linear regression, their generalization to nonlinear regression in material modeling is highly problem-specific and insufficiently understood. Here we explore the potential of neural networks for automated model discovery and induce sparsity by a hybrid approach that combines two strategies: regularization and physical constraints. We integrate the concept of Lp regularization for subset selection with constitutive neural networks that leverage our domain knowledge in kinematics and thermodynamics. We train our networks with both synthetic and real data, and perform several thousand discovery runs to infer common guidelines and trends: L2 regularization or ridge regression is unsuitable for model discovery; L1 regularization or lasso promotes sparsity, but induces a strong bias that may aggressively change the results; only L0 regularization allows us to transparently fine-tune the trade-off between interpretability and predictability, simplicity and accuracy, and bias and variance. With these insights, we demonstrate that Lp-regularized constitutive neural networks can simultaneously discover both interpretable models and physically meaningful parameters. We anticipate that our findings will generalize to alternative discovery techniques such as sparse and symbolic regression, and to other domains such as biology, chemistry, or medicine. Our ability to automatically discover material models from data could have tremendous applications in generative material design and open new opportunities to manipulate matter, alter the properties of existing materials, and discover new materials with user-defined properties.
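The abstract's comparison of L2, L1, and L0 penalties can be illustrated on a toy linear sparse-regression problem. The sketch below is not the paper's constitutive-neural-network code; it is a minimal, self-contained illustration assuming a synthetic data set and standard solvers: ridge in closed form, lasso via proximal gradient descent (soft thresholding), and L0 via iterative hard thresholding. The data, the variable names, and the regularization weight lam are all illustrative choices.

```python
# Minimal sketch (assumed toy example, not the paper's code): compare how
# L2, L1, and L0 penalties act on a linear sparse-regression problem.
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 10))
w_true = np.zeros(10)
w_true[[1, 4]] = [2.0, -1.5]               # sparse ground truth: 2 active terms
y = X @ w_true + 0.05 * rng.standard_normal(100)
lam = 20.0                                 # illustrative regularization weight

def ridge(X, y, lam):
    """L2 (ridge): closed form; shrinks every weight, none becomes exactly zero."""
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

def lasso_ista(X, y, lam, n_iter=500):
    """L1 (lasso): proximal gradient descent with soft thresholding."""
    step = 1.0 / np.linalg.norm(X, 2) ** 2  # 1 / Lipschitz constant of gradient
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        g = w - step * X.T @ (X @ w - y)                          # gradient step
        w = np.sign(g) * np.maximum(np.abs(g) - step * lam, 0.0)  # soft threshold
    return w

def l0_iht(X, y, lam, n_iter=500):
    """L0: iterative hard thresholding; a weight survives only if it pays for
    its lam-cost, and the surviving weights remain essentially unbiased."""
    step = 1.0 / np.linalg.norm(X, 2) ** 2
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        g = w - step * X.T @ (X @ w - y)
        w = np.where(g ** 2 > 2.0 * step * lam, g, 0.0)           # hard threshold
    return w

for name, w in [("L2", ridge(X, y, lam)),
                ("L1", lasso_ista(X, y, lam)),
                ("L0", l0_iht(X, y, lam))]:
    print(name, np.round(w, 2), "| nonzero terms:", np.count_nonzero(w))
```

On this toy problem the three penalties should reproduce the qualitative trends stated in the abstract: ridge keeps all ten terms small but nonzero, lasso zeroes the inactive terms but biases the two active weights downward by roughly lam divided by the squared column norm, and hard thresholding recovers the two active terms nearly unbiased.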
Pages: 33
Related papers
50 records in total
  • [31] Sparse modeling using orthogonal forward regression with PRESS statistic and regularization
    Chen, S
    Hong, X
    Harris, CJ
    Sharkey, PM
    IEEE TRANSACTIONS ON SYSTEMS MAN AND CYBERNETICS PART B-CYBERNETICS, 2004, 34 (02) : 898 - 911
  • [32] Subspace quadratic regularization method for group sparse multinomial logistic regression
    Wang, Rui
    Xiu, Naihua
    Toh, Kim-Chuan
    COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2021, 79 (03) : 531 - 559
  • [33] Hyperspectral unmixing using weighted sparse regression with total variation regularization
    Ren, Longfei
    Ma, Zheng
    Bovolo, Francesca
    Hu, Jianming
    Bruzzone, Lorenzo
    INTERNATIONAL JOURNAL OF REMOTE SENSING, 2022, 43 (15-16) : 6124 - 6151
  • [35] Adaptive L0 Regularization for Sparse Support Vector Regression
    Christou, Antonis
    Artemiou, Andreas
    MATHEMATICS, 2023, 11 (13)
  • [36] lp-Norm Independently Interpretable Regularization Based Sparse Coding for Highly Correlated Data
    Zhao, Haoli
    Ding, Shuxue
    Li, Xiang
    Zhao, Lingjun
    IEEE ACCESS, 2019, 7 : 53542 - 53554
  • [37] Sparse reconstruction of EMT based on compressed sensing and Lp regularization with the split Bregman method
    Liu, Xianglong
    Wang, Ying
    Li, Danyang
    Li, Linwei
    FLOW MEASUREMENT AND INSTRUMENTATION, 2023, 94
  • [38] Can Critical-Point Paths Under lp-Regularization (0 < p < 1) Reach the Sparsest Least Squares Solutions?
    Jeong, Kwangjin
    Yukawa, Masahiro
    Amari, Shun-ichi
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2014, 60 (05) : 2960 - 2968
  • [39] Regularization Parameter Selection for a Bayesian Group Sparse Multi-Task Regression Model with Application to Imaging Genomics
    Nathoo, Farouk S.
    Greenlaw, Keelin
    Lesperance, Mary
    2016 6TH INTERNATIONAL WORKSHOP ON PATTERN RECOGNITION IN NEUROIMAGING (PRNI), 2016, : 9 - 12
  • [40] Structural regularization in quadratic logistic regression model
    Jiang, He
    Dong, Yao
    KNOWLEDGE-BASED SYSTEMS, 2019, 163 : 842 - 857