Synthetic pre-training for neural-network interatomic potentials

Cited by: 9
Authors
Gardner, John L. A. [1 ]
Baker, Kathryn T. [1 ]
Deringer, Volker L. [1 ]
Affiliations
[1] Univ Oxford, Dept Chem, Inorgan Chem Lab, Oxford OX1 3QR, England
Source
MACHINE LEARNING: SCIENCE AND TECHNOLOGY | 2024, Vol. 5, No. 1
Funding
UK Research and Innovation (UKRI); Engineering and Physical Sciences Research Council (EPSRC);
Keywords
machine learning; neural networks; synthetic data; atomistic simulations; molecular dynamics;
DOI
10.1088/2632-2153/ad1626
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Machine learning (ML) based interatomic potentials have transformed the field of atomistic materials modelling. However, ML potentials depend critically on the quality and quantity of the quantum-mechanical reference data on which they are trained, and therefore developing datasets and training pipelines is becoming an increasingly central challenge. Leveraging the idea of 'synthetic' (artificial) data that are common in other areas of ML research, we show here that synthetic atomistic data, themselves obtained at scale with an existing ML potential, constitute a useful pre-training task for neural-network (NN) interatomic potential models. Once pre-trained on a large synthetic dataset, these models can be fine-tuned on a much smaller, quantum-mechanical one, improving numerical accuracy and stability in computational practice. We demonstrate feasibility for a series of equivariant graph-NN potentials for carbon, and we carry out initial experiments to test the limits of the approach.
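The two-stage workflow summarized in the abstract (pre-train on many cheap, ML-potential-labelled structures, then fine-tune on a small quantum-mechanical set) can be sketched in a few lines. The PyTorch sketch below is not the authors' code: the descriptor-based stand-in model, the fabricated data, and the learning-rate schedule are all hypothetical placeholders that only mirror the training recipe.

# Minimal sketch of synthetic pre-training as described in the abstract:
# pre-train a neural-network potential on cheap labels produced by an
# existing ML potential, then fine-tune on a small quantum-mechanical
# dataset. All names and data here are hypothetical placeholders.

import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

torch.manual_seed(0)

# Stand-in "potential": maps a fixed-size structural descriptor to an energy.
# (In the paper's setting this would be an equivariant graph NN instead.)
model = nn.Sequential(nn.Linear(64, 128), nn.SiLU(), nn.Linear(128, 1))

def make_loader(n_structures, noise):
    """Fake descriptors with energy labels; a placeholder for labels generated
    at scale by an existing ML potential (stage 1) or by DFT (stage 2)."""
    x = torch.randn(n_structures, 64)
    y = x.sum(dim=1, keepdim=True) + noise * torch.randn(n_structures, 1)
    return DataLoader(TensorDataset(x, y), batch_size=32, shuffle=True)

def train(loader, epochs, lr):
    """Standard mean-squared-error regression loop on energy labels."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()
            opt.step()

# Stage 1: pre-train on a large synthetic dataset (cheap ML-potential labels).
train(make_loader(10_000, noise=0.1), epochs=5, lr=1e-3)

# Stage 2: fine-tune on a much smaller quantum-mechanical dataset, at a
# reduced learning rate so the pre-trained weights are largely preserved.
train(make_loader(500, noise=0.01), epochs=20, lr=1e-4)

In the paper's actual setting the labels are energies and forces for carbon structures and the model is an equivariant graph NN; the placeholder regression above illustrates only the pre-train/fine-tune schedule, not the potentials themselves.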
Pages: 11