STACKED NETWORKS IMPROVE PHYSICS-INFORMED TRAINING: APPLICATIONS TO NEURAL NETWORKS AND DEEP OPERATOR NETWORKS
Cited by: 1
Authors: Howard, Amanda A. [1]; Murphy, Sarah H. [1,2]; Ahmed, Shady E. [1]; Stinis, Panos [1,3,4]
Affiliations:
[1] Pacific Northwest Natl Lab, Richland, WA 99354 USA
[2] Univ N Carolina, Charlotte, NC USA
[3] Univ Washington, Dept Appl Math, Seattle, WA 98105 USA
[4] Brown Univ, Div Appl Math, Providence, RI 02812 USA
Funding: Academy of Finland
Keywords: Physics-informed neural networks; deep operator networks; physics-informed operator networks; multifidelity; neural networks; FRAMEWORK
DOI: 10.3934/fods.2024029
CLC classification: O29 [Applied Mathematics]
Discipline code: 070104
Abstract:
Physics-informed neural networks and operator networks have shown promise for effectively solving equations modeling physical systems. However, these networks can be difficult or impossible to train accurately for some systems of equations. We present a novel multifidelity framework for stacking physics-informed neural networks and operator networks that facilitates training. We successively build a chain of networks, where the output at one step can act as a low-fidelity input for training the next step, gradually increasing the expressivity of the learned model. The equations imposed at each step of the iterative process can be the same or different (akin to simulated annealing). The iterative (stacking) nature of the proposed method allows us to progressively learn features of a solution that are hard to learn directly. Through benchmark problems including a nonlinear pendulum, the wave equation, and the viscous Burgers equation, we show how stacking can be used to improve the accuracy and reduce the required size of physics-informed neural networks and operator networks.
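The stacking idea described in the abstract can be sketched schematically: each stage in the chain learns only what the running prediction has not yet captured, so expressivity grows stage by stage. The sketch below is a hypothetical stand-in, not the paper's method: low-capacity polynomial fits replace the physics-informed networks, and each stage fits the residual left by the previous stages, with degree (expressivity) increasing along the chain.

```python
import numpy as np

# Hypothetical stand-in for multifidelity stacking: polynomial "networks"
# of growing degree replace the PINNs of the paper. Each stage consumes
# the previous stages' prediction (via the residual) and adds a correction.
x = np.linspace(0.0, 2.0 * np.pi, 200)
target = np.sin(x) + 0.3 * np.sin(3.0 * x)  # fine-scale part is hard to fit directly


def fit_stage(x, y, degree):
    """One low-capacity learner in the stack (least-squares polynomial fit)."""
    return np.polynomial.Polynomial.fit(x, y, degree)(x)


prediction = np.zeros_like(x)  # stage 0: trivial low-fidelity guess
errors = []
for stage in range(4):
    residual = target - prediction          # what the stack has not yet captured
    degree = 3 + 2 * stage                  # expressivity grows along the chain
    prediction = prediction + fit_stage(x, residual, degree)
    errors.append(float(np.max(np.abs(target - prediction))))

# The stacked chain ends with a smaller worst-case error than its first stage.
assert errors[-1] < errors[0]
```

The design choice mirrored here is the greedy, stage-wise decomposition: no stage is retrained once built, and later stages only refine the residual, which is what makes progressively learning hard-to-capture features tractable.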
Pages: 29