Toward Stable, General Machine-Learned Models of the Atmospheric Chemical System

Cited: 26
Authors
Kelp, Makoto M. [1 ]
Jacob, Daniel J. [1 ]
Kutz, J. Nathan [2 ]
Marshall, Julian D. [3 ]
Tessum, Christopher W. [4 ]
Affiliations
[1] Harvard Univ, Dept Earth & Planetary Sci, 20 Oxford St, Cambridge, MA 02138 USA
[2] Univ Washington, Dept Appl Math, Seattle, WA 98195 USA
[3] Univ Washington, Dept Civil & Environm Engn, Seattle, WA 98195 USA
[4] Univ Illinois, Dept Civil & Environm Engn, Urbana, IL 61801 USA
Keywords
machine learning; atmospheric chemical mechanism; model emulation; surrogate model; chemical mechanism; TROPOSPHERIC CHEMISTRY; NEURAL-NETWORKS; GAS; AIR; SENSITIVITY; IMPACTS; OZONE; AREA
DOI
10.1029/2020JD032759
Chinese Library Classification
P4 [Atmospheric sciences (meteorology)]
Discipline codes
0706; 070601
Abstract
Atmospheric chemistry models (components in models that simulate air pollution and climate change) are computationally expensive. Previous studies have shown that machine-learned atmospheric chemical solvers can be orders of magnitude faster than traditional integration methods but tend to suffer from numerical instability. Here, we present a modeling framework that reduces error accumulation compared to previous work while maintaining computational efficiency. Our approach is novel in that it (1) uses a recurrent training regime that results in extended (>1 week) simulations without exponential error accumulation and (2) can reversibly compress the number of modeled chemical species by >80% without further decreasing accuracy. We observe a ~260× speedup (~1,900× with specialized hardware) compared to the traditional solver. We use random initial conditions in training to promote general applicability across a wide range of atmospheric conditions. For ozone (concentrations ranging from 0 to 70 ppb), our model predictions over a 24-hr simulation period match those of the reference solver with a median error of 2.7 ppb and <19 ppb error across 99% of simulations initialized with random noise. Error can be significantly higher in the remaining 1% of simulations, which include extreme concentration fluctuations simulated by the reference model. Results are similar for total particulate matter (median error of 16 μg/m³ and <32 μg/m³ across 99% of simulations, with concentrations ranging from 0 to 150 μg/m³). Finally, we discuss practical implications of our modeling framework and next steps for improvements. The machine learning models described here are not yet replacements for traditional chemistry solvers but represent a step toward that goal.
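The recurrent training regime described in the abstract can be illustrated with a minimal sketch: instead of fitting single-step transitions, the surrogate is unrolled for several steps during training and fed its own outputs, so accumulated error is penalized directly. Everything below is an assumption for illustration only; the "reference solver" here is a toy stable linear system, not the paper's actual chemical mechanism, and all names (`reference_step`, `rollout_loss`, `A_hat`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n_species = 4
# Toy stand-in for the reference chemistry: a stable linear map near identity.
A_true = np.eye(n_species) - 0.05 * rng.random((n_species, n_species))

def reference_step(c):
    """One step of the (toy) reference solver."""
    return A_true @ c

def rollout_loss(A_hat, c0, n_steps):
    """Mean squared error of the surrogate unrolled for n_steps,
    feeding the surrogate its OWN previous output (recurrent rollout)."""
    c_ref, c_sur, err = c0.copy(), c0.copy(), 0.0
    for _ in range(n_steps):
        c_ref = reference_step(c_ref)
        c_sur = A_hat @ c_sur
        err += np.mean((c_sur - c_ref) ** 2)
    return err / n_steps

# Train the surrogate on the multi-step rollout loss via finite-difference
# gradient descent, drawing random initial conditions each iteration
# (mirroring the random-initial-condition training described above).
A_hat = np.eye(n_species)
lr, eps, n_steps = 0.2, 1e-5, 10
for _ in range(300):
    c0 = rng.random(n_species)
    base = rollout_loss(A_hat, c0, n_steps)
    grad = np.zeros_like(A_hat)
    for i in range(n_species):
        for j in range(n_species):
            A_pert = A_hat.copy()
            A_pert[i, j] += eps
            grad[i, j] = (rollout_loss(A_pert, c0, n_steps) - base) / eps
    A_hat -= lr * grad
# After training, the surrogate's multi-step rollout tracks the reference
# far more closely than the untrained (identity) surrogate does.
```

The key design point this sketch demonstrates is that the loss is computed over the surrogate's own trajectory rather than over teacher-forced one-step pairs, which is what discourages the exponential error growth reported for earlier machine-learned solvers.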
Pages: 13