On sparse regression, Lp-regularization, and automated model discovery

Cited by: 11
Authors
McCulloch, Jeremy A. [1]
St Pierre, Skyler R. [1]
Linka, Kevin [2]
Kuhl, Ellen [1,3]
Affiliations
[1] Stanford Univ, Dept Mech Engn & Bioengn, Stanford, CA USA
[2] Hamburg Univ Technol, Inst Continuum & Mat Mech, Hamburg, Germany
[3] Stanford Univ, Dept Mech Engn, 452 Escondido Mall, Stanford, CA 94305 USA
Funding
US National Science Foundation
Keywords
automated model discovery; constitutive modeling; hyperelasticity; Lp regularization; sparse regression; DEFORMATION; ELASTICITY; SELECTION;
DOI
10.1002/nme.7481
Chinese Library Classification
T [Industrial Technology]
Subject Classification Code
08
Abstract
Sparse regression and feature extraction are the cornerstones of knowledge discovery from massive data. Their goal is to discover interpretable and predictive models that provide simple relationships among scientific variables. While the statistical tools for model discovery are well established in the context of linear regression, their generalization to nonlinear regression in material modeling is highly problem-specific and insufficiently understood. Here we explore the potential of neural networks for automatic model discovery and induce sparsity by a hybrid approach that combines two strategies: regularization and physical constraints. We integrate the concept of Lp regularization for subset selection with constitutive neural networks that leverage our domain knowledge in kinematics and thermodynamics. We train our networks with both synthetic and real data, and perform several thousand discovery runs to infer common guidelines and trends: L2 regularization or ridge regression is unsuitable for model discovery; L1 regularization or lasso promotes sparsity, but induces strong bias that may aggressively change the results; only L0 regularization allows us to transparently fine-tune the trade-off between interpretability and predictability, simplicity and accuracy, and bias and variance. With these insights, we demonstrate that Lp regularized constitutive neural networks can simultaneously discover both interpretable models and physically meaningful parameters. We anticipate that our findings will generalize to alternative discovery techniques such as sparse and symbolic regression, and to other domains such as biology, chemistry, or medicine. Our ability to automatically discover material models from data could have tremendous applications in generative material design and open new opportunities to manipulate matter, alter properties of existing materials, and discover new materials with user-defined properties.
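The abstract's contrast between ridge (L2) and lasso (L1) regularization can be illustrated on a small synthetic sparse-recovery problem. The following sketch is not from the paper; it is a minimal NumPy demonstration, with made-up dimensions and penalty weight, of why L2 shrinks all coefficients while keeping them nonzero, whereas L1 (solved here by ISTA proximal-gradient iterations with soft thresholding) drives the inactive ones exactly to zero:

```python
import numpy as np

# Synthetic sparse-recovery problem: only 3 of 20 features are active.
rng = np.random.default_rng(0)
n, p = 100, 20
X = rng.standard_normal((n, p))
w_true = np.zeros(p)
w_true[:3] = [2.0, -1.5, 1.0]
y = X @ w_true + 0.05 * rng.standard_normal(n)

lam = 0.1  # regularization weight (illustrative choice)

# Ridge (L2): closed-form solution of min ||y - Xw||^2 / (2n) + lam * ||w||_2^2
w_ridge = np.linalg.solve(X.T @ X / n + 2.0 * lam * np.eye(p), X.T @ y / n)

# Lasso (L1): ISTA iterations for min ||y - Xw||^2 / (2n) + lam * ||w||_1
L = np.linalg.norm(X, 2) ** 2 / n  # Lipschitz constant of the smooth gradient
w_lasso = np.zeros(p)
for _ in range(5000):
    grad = X.T @ (X @ w_lasso - y) / n
    z = w_lasso - grad / L
    w_lasso = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold

print("nonzero ridge coefficients:", int(np.sum(np.abs(w_ridge) > 1e-6)))
print("nonzero lasso coefficients:", int(np.sum(np.abs(w_lasso) > 1e-6)))
```

Ridge keeps essentially all 20 coefficients nonzero, while lasso recovers a sparse solution, but with its nonzero coefficients shrunk toward zero, which is the L1 bias the abstract refers to; L0 penalties avoid that bias at the cost of a non-convex subset-selection problem.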
Pages: 33