Machine Learning Potentials with the Iterative Boltzmann Inversion: Training to Experiment

Cited by: 9
Authors
Matin, Sakib [1 ,2 ,3 ]
Allen, Alice E. A. [2 ,3 ]
Smith, Justin [2 ,4 ]
Lubbers, Nicholas [5 ]
Jadrich, Ryan B. [2 ]
Messerly, Richard [2 ]
Nebgen, Benjamin [2 ]
Li, Ying Wai [5 ]
Tretiak, Sergei [2 ,3 ,6 ]
Barros, Kipton [2 ,3 ]
Affiliations
[1] Boston Univ, Dept Phys, Boston, MA 02215 USA
[2] Los Alamos Natl Lab, Theoret Div, Los Alamos, NM 87545 USA
[3] Los Alamos Natl Lab, Ctr Nonlinear Studies, Los Alamos, NM 87545 USA
[4] NVIDIA Corp, Santa Clara, CA 95051 USA
[5] Los Alamos Natl Lab, Comp Computat & Stat Sci Div, Los Alamos, NM 87545 USA
[6] Los Alamos Natl Lab, Ctr Integrated Nanotechnol, Los Alamos, NM 87545 USA
Keywords
SIMULATION; CHEMISTRY;
DOI
10.1021/acs.jctc.3c01051
Chinese Library Classification
O64 [Physical Chemistry (Theoretical Chemistry), Chemical Physics]
Subject Classification Codes
070304; 081704
Abstract
Methodologies for training machine learning potentials (MLPs) with quantum-mechanical simulation data have recently seen tremendous progress. Experimental data have a very different character than simulated data, and most MLP training procedures cannot be easily adapted to incorporate both types of data into the training process. We investigate a training procedure based on iterative Boltzmann inversion that produces a pair potential correction to an existing MLP using equilibrium radial distribution function data. By applying these corrections to an MLP for pure aluminum based on density functional theory, we observe that the resulting model largely addresses previous overstructuring in the melt phase. Interestingly, the corrected MLP also exhibits improved performance in predicting experimental diffusion constants, which are not included in the training procedure. The presented method does not require autodifferentiating through a molecular dynamics solver and does not make assumptions about the MLP architecture. Our results suggest a practical framework for incorporating experimental data into machine learning models to improve the accuracy of molecular dynamics simulations.
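Below is a minimal sketch, in Python, of the iterative Boltzmann inversion (IBI) update that the abstract refers to: a tabulated pair-potential correction dU(r) is refined until the simulated radial distribution function matches an experimental target. The helper simulate_rdf (which would run molecular dynamics with the MLP plus the current correction and return g(r) on the grid r), along with the parameter names n_iterations, mixing, and eps, are hypothetical choices for illustration and are not taken from the paper; only the IBI update rule dU <- dU + mixing * kB*T * ln(g_sim / g_target) reflects the standard method named in the title.

import numpy as np

KB = 8.617333262e-5  # Boltzmann constant in eV/K


def ibi_pair_correction(r, g_target, simulate_rdf, temperature,
                        n_iterations=20, mixing=0.2, eps=1e-8):
    """Iteratively refine a tabulated pair correction dU(r) so that the
    radial distribution function simulated with MLP + dU approaches the
    experimental target g_target(r)."""
    dU = np.zeros_like(r)                # start from the uncorrected MLP
    for _ in range(n_iterations):
        g_sim = simulate_rdf(r, dU)      # MD with MLP + current correction
        # IBI update: dU <- dU + mixing * kB*T * ln(g_sim / g_target)
        dU += mixing * KB * temperature * np.log((g_sim + eps) / (g_target + eps))
    return dU

Because the correction is a tabulated pair term fitted after the fact, no gradients need to be propagated through the MD solver and no assumption about the MLP architecture is required, consistent with the abstract.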
Pages: 1274-1281 (8 pages)
Related Articles (50 in total)
  • [31] Generative and discriminative infinite restricted Boltzmann machine training
    Wang, Qianglong
    Gao, Xiaoguang
    Wan, Kaifang
    Hu, Zijian
    INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, 2022, 37 (10) : 7857 - 7887
  • [32] Toward Reliable and Transferable Machine Learning Potentials: Uniform Training by Overcoming Sampling Bias
    Jeong, Wonseok
    Lee, Kyuhyun
    Yoo, Dongsun
    Lee, Dongheon
    Han, Seungwu
    JOURNAL OF PHYSICAL CHEMISTRY C, 2018, 122 (39) : 22790 - 22795
  • [33] Metadynamics sampling in atomic environment space for collecting training data for machine learning potentials
    Yoo, Dongsun
    Jung, Jisu
    Jeong, Wonseok
    Han, Seungwu
    NPJ COMPUTATIONAL MATERIALS, 2021, 7 (01)
  • [34] Training machine-learning potentials for crystal structure prediction using disordered structures
    Hong, Changho
    Choi, Jeong Min
    Jeong, Wonseok
    Kang, Sungwoo
    Ju, Suyeon
    Lee, Kyeongpung
    Jung, Jisu
    Youn, Yong
    Han, Seungwu
    PHYSICAL REVIEW B, 2020, 102 (22)
  • [35] A Hessian-based assessment of atomic forces for training machine learning interatomic potentials
    Herbold, Marius
    Behler, Joerg
    JOURNAL OF CHEMICAL PHYSICS, 2022, 156 (11)
  • [36] SPICE, A Dataset of Drug-like Molecules and Peptides for Training Machine Learning Potentials
    Eastman, Peter
    Behara, Pavan Kumar
    Dotson, David L.
    Galvelis, Raimondas
    Herr, John E.
    Horton, Josh T.
    Mao, Yuezhi
    Chodera, John D.
    Pritchard, Benjamin P.
    Wang, Yuanqing
    De Fabritiis, Gianni
    Markland, Thomas E.
    SCIENTIFIC DATA, 2023, 10 (01)
  • [37] On-the-fly training of polynomial machine learning potentials in computing lattice thermal conductivity
    Togo, Atsushi
    Seko, Atsuto
    JOURNAL OF CHEMICAL PHYSICS, 2024, 160 (21)
  • [40] Iterative Machine Learning for Output Tracking
    Devasia, Santosh
    IEEE TRANSACTIONS ON CONTROL SYSTEMS TECHNOLOGY, 2019, 27 (02) : 516 - 526