Hard-constrained neural networks for modeling nonlinear acoustics

Cited by: 2
Authors
Ozan D.E. [1 ]
Magri L. [1 ,2 ]
Affiliations
[1] Department of Aeronautics, Imperial College London, London
[2] Alan Turing Institute, London
Funding
European Research Council
Keywords
Chemical activation;
DOI
10.1103/PhysRevFluids.8.103201
Abstract
In this computational paper, we model acoustic dynamics in space and time from synthetic sensor data. The tasks are (i) to predict and extrapolate the spatiotemporal dynamics and (ii) to reconstruct the acoustic state from partial observations. To achieve this, we develop acoustic neural networks, i.e., networks that learn from sensor data while being constrained by prior knowledge of acoustic and wave physics. The prior knowledge is imposed either as a soft constraint, which informs the training, or as a hard constraint (Galerkin neural networks), which restricts part of the network's architecture as an inductive bias. First, we show that standard feedforward neural networks are unable to extrapolate in time, even in the simplest case of periodic oscillations. This motivates constraining the networks with prior knowledge. Second, we constrain the prior knowledge on acoustics in increasingly effective ways by (i) employing periodic activations (periodically activated neural networks), (ii) informing the training of the networks with a penalty term that favors solutions that fulfill the governing equations (soft constrained), (iii) constraining the architecture to a physically motivated solution space (hard constrained), and (iv) combinations of these. Third, we apply the networks to two test cases and two tasks in nonlinear regimes, from periodic to chaotic oscillations. The first test case is a twin experiment, in which the data are produced by a prototypical time-delayed model. In the second test case, the data are generated by a higher-fidelity model with mean-flow effects and a kinematic model for the flame source. We find that (i) constraining the physics in the architecture improves interpolation while requiring smaller network sizes, (ii) extrapolation in time is achieved by periodic activations, and (iii) the velocity can be reconstructed accurately from pressure measurements alone with a combination of physics-based hard and soft constraints. In acoustics and thermoacoustics, this work opens possibilities for physics-constrained data-driven modeling. Beyond acoustics, this work opens strategies for constraining the physics in the architecture, rather than in the training. © 2023 Authors. Published by the American Physical Society under the terms of the Creative Commons Attribution 4.0 International license (https://creativecommons.org/licenses/by/4.0/). Further distribution of this work must maintain attribution to the author(s) and the published article's title, journal citation, and DOI.
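Two of the ingredients described in the abstract, periodic (sine) activations and a soft physics penalty that favors solutions of the acoustic governing equations, can be combined in a short training loop using automatic differentiation. The following is a minimal sketch, not the authors' implementation: the network sizes, the 1D wave-equation residual, the stand-in sensor data, the collocation sampling, and the penalty weight lambda_phys are illustrative assumptions.

# Minimal sketch (not the paper's code): a periodically activated network
# trained on pressure sensor data with a soft physics penalty that favors
# solutions of the 1D acoustic wave equation p_tt - c^2 p_xx = 0.
import torch
import torch.nn as nn

class PeriodicMLP(nn.Module):
    """Feedforward network with sine activations, p = f(x, t)."""
    def __init__(self, width=64, depth=3):
        super().__init__()
        layers, in_dim = [], 2  # inputs: (x, t)
        for _ in range(depth):
            layers.append(nn.Linear(in_dim, width))
            in_dim = width
        self.hidden = nn.ModuleList(layers)
        self.out = nn.Linear(width, 1)

    def forward(self, x, t):
        h = torch.cat([x, t], dim=-1)
        for layer in self.hidden:
            h = torch.sin(layer(h))  # periodic activation
        return self.out(h)

def wave_residual(model, x, t, c=1.0):
    """Soft constraint: residual p_tt - c^2 p_xx at collocation points."""
    x = x.clone().requires_grad_(True)
    t = t.clone().requires_grad_(True)
    p = model(x, t)
    ones = torch.ones_like(p)
    p_t = torch.autograd.grad(p, t, ones, create_graph=True)[0]
    p_x = torch.autograd.grad(p, x, ones, create_graph=True)[0]
    p_tt = torch.autograd.grad(p_t, t, torch.ones_like(p_t), create_graph=True)[0]
    p_xx = torch.autograd.grad(p_x, x, torch.ones_like(p_x), create_graph=True)[0]
    return p_tt - c**2 * p_xx

# Training: data misfit at sensor points + physics penalty at collocation points.
model = PeriodicMLP()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
x_obs, t_obs = torch.rand(256, 1), torch.rand(256, 1)               # synthetic sensor locations/times
p_obs = torch.sin(torch.pi * x_obs) * torch.cos(torch.pi * t_obs)   # stand-in for sensor data
x_col, t_col = torch.rand(1024, 1), torch.rand(1024, 1)             # collocation points
lambda_phys = 1.0                                                   # penalty weight (illustrative)
for step in range(2000):
    opt.zero_grad()
    loss_data = ((model(x_obs, t_obs) - p_obs) ** 2).mean()
    loss_phys = (wave_residual(model, x_col, t_col) ** 2).mean()
    loss = loss_data + lambda_phys * loss_phys
    loss.backward()
    opt.step()

A hard-constrained (Galerkin) variant would instead build the acoustic modal expansion directly into the output layer of the network, so that the predicted fields satisfy the chosen solution space by construction rather than through the penalty term.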
Related papers
50 records in total
  • [1] Zhang, Chengyuan; Wang, Qunming; Atkinson, Peter M. Hard-Constrained Hopfield Neural Network for Subpixel Mapping. IEEE Transactions on Geoscience and Remote Sensing, 2024, 62.
  • [2] Combettes, P. L.; Bondon, P. Hard-constrained signal feasibility problems. Proceedings of the 1997 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), 1997: 2569-2572.
  • [3] Benavoli, A.; Chisci, L.; Farina, A.; Ortenzi, L.; Zappa, G. Hard-constrained versus soft-constrained parameter estimation. IEEE Transactions on Aerospace and Electronic Systems, 2006, 42(4): 1224-1239.
  • [4] Combettes, P. L.; Bondon, P. Hard-constrained inconsistent signal feasibility problems. IEEE Transactions on Signal Processing, 1999, 47(9): 2460-2468.
  • [5] Harder, Paula; Hernandez-Garcia, Alex; Ramesh, Venkatesh; Yang, Qidong; Sattegeri, Prasanna; Szwarcman, Daniela; Watson, Campbell D.; Rolnick, David. Hard-Constrained Deep Learning for Climate Downscaling. Journal of Machine Learning Research, 2023, 24.
  • [6] Galanis, Andreas; Kalavasis, Alkis; Kandiros, Anthimos Vardis. Learning Hard-Constrained Models with One Sample. Proceedings of the 2024 Annual ACM-SIAM Symposium on Discrete Algorithms (SODA), 2024: 3184-3196.
  • [7] Taufik, Mohammad H.; Alkhalifah, Tariq; bin Waheed, Umair. Stable neural network-based traveltime tomography using hard-constrained measurements. Geophysics, 2024, 89(6).
  • [8] Ke, Wenjun; Guo, Yikai; Liu, Qi; Chen, Wanyi; Wang, Peng; Luo, Haoran; Luo, Zhizhao. MDM: Meta diffusion model for hard-constrained text generation. Knowledge-Based Systems, 2024, 283.
  • [9] Sayyar-Rodsari, B.; Hartman, E.; Plumer, E.; Liano, K.; Schweiger, C. Extrapolating gain-constrained neural networks - effective modeling for nonlinear control. 2004 43rd IEEE Conference on Decision and Control (CDC), 2004: 4964-4971.
  • [10] Zhu, Weigang; Liu, Xiaoming; Yang, Guan; Liu, Jie; Qi, Haotian. Latent Variables Improve Hard-Constrained Controllable Text Generation on Weak Correlation. International Journal of Advanced Computer Science and Applications, 2024, 15(6): 365-374.