On sparse regression, Lp-regularization, and automated model discovery

Times Cited: 3
Authors
McCulloch, Jeremy A. [1 ]
St Pierre, Skyler R. [1 ]
Linka, Kevin [2 ]
Kuhl, Ellen [1 ,3 ]
Affiliations
[1] Stanford Univ, Dept Mech Engn & Bioengn, Stanford, CA USA
[2] Hamburg Univ Technol, Inst Continuum & Mat Mech, Hamburg, Germany
[3] Stanford Univ, Dept Mech Engn, 452 Escondido Mall, Stanford, CA 94305 USA
Funding
U.S. National Science Foundation (NSF);
Keywords
automated model discovery; constitutive modeling; hyperelasticity; Lp regularization; sparse regression; DEFORMATION; ELASTICITY; SELECTION;
DOI
10.1002/nme.7481
Chinese Library Classification
T [Industrial Technology];
Discipline Code
08;
Abstract
Sparse regression and feature extraction are the cornerstones of knowledge discovery from massive data. Their goal is to discover interpretable and predictive models that provide simple relationships among scientific variables. While the statistical tools for model discovery are well established in the context of linear regression, their generalization to nonlinear regression in material modeling is highly problem-specific and insufficiently understood. Here we explore the potential of neural networks for automatic model discovery and induce sparsity by a hybrid approach that combines two strategies: regularization and physical constraints. We integrate the concept of L-p regularization for subset selection with constitutive neural networks that leverage our domain knowledge in kinematics and thermodynamics. We train our networks with both synthetic and real data, and perform several thousand discovery runs to infer common guidelines and trends: L-2 regularization or ridge regression is unsuitable for model discovery; L-1 regularization or lasso promotes sparsity, but induces strong bias that may aggressively change the results; only L-0 regularization allows us to transparently fine-tune the trade-off between interpretability and predictability, simplicity and accuracy, and bias and variance. With these insights, we demonstrate that L-p regularized constitutive neural networks can simultaneously discover both interpretable models and physically meaningful parameters. We anticipate that our findings will generalize to alternative discovery techniques such as sparse and symbolic regression, and to other domains such as biology, chemistry, or medicine. Our ability to automatically discover material models from data could have tremendous applications in generative material design and open new opportunities to manipulate matter, alter properties of existing materials, and discover new materials with user-defined properties.
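The contrast the abstract draws between L-2 (shrinkage without sparsity), L-1 (sparsity with bias), and L-0 (sparsity without bias) can be illustrated with a minimal sketch that is not taken from the paper: under the simplifying assumption of an orthonormal design, the L-p regularized least-squares solution has a closed form per coefficient — uniform shrinkage for L-2, soft thresholding for L-1, and hard thresholding for L-0.

```python
import numpy as np

def ridge(w, lam):
    # L2 penalty: minimizes (1/2)(x - w)^2 + lam * x^2.
    # Shrinks every coefficient uniformly; none becomes exactly zero.
    return w / (1.0 + 2.0 * lam)

def lasso(w, lam):
    # L1 penalty: minimizes (1/2)(x - w)^2 + lam * |x|.
    # Soft thresholding: sparse, but surviving coefficients are biased
    # toward zero by exactly lam.
    return np.sign(w) * np.maximum(np.abs(w) - lam, 0.0)

def l0(w, lam):
    # L0 penalty: minimizes (1/2)(x - w)^2 + lam * 1[x != 0].
    # Hard thresholding: keep w unchanged if w^2/2 > lam, else zero it.
    # Sparse and unbiased on the retained coefficients.
    return np.where(np.abs(w) > np.sqrt(2.0 * lam), w, 0.0)

coeffs = np.array([3.0, 0.5, -2.0, 0.1])  # hypothetical least-squares fit
lam = 0.4

print(ridge(coeffs, lam))  # all shrunk, none zero
print(lasso(coeffs, lam))  # → [ 2.6  0.1 -1.6  0. ]  (sparse, biased)
print(l0(coeffs, lam))     # → [ 3.  0. -2.  0. ]     (sparse, unbiased)
```

This makes the abstract's trade-off concrete: lasso zeroes small coefficients but also distorts the large ones (3.0 becomes 2.6), whereas the L-0 solution keeps the retained parameters at their unregularized, physically meaningful values — at the cost of a combinatorial, non-convex penalty in the general (non-orthonormal) setting.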
Pages: 33