Toward fully automated UED operation using two-stage machine learning model

Cited by: 0
Authors
Zhe Zhang
Xi Yang
Xiaobiao Huang
Timur Shaftan
Victor Smaluk
Minghao Song
Weishi Wan
Lijun Wu
Yimei Zhu
Affiliations
[1] SLAC National Accelerator Laboratory
[2] National Synchrotron Light Source II, Brookhaven National Laboratory
[3] School of Physical Science and Technology, ShanghaiTech University
[4] Condensed Matter Physics and Materials Science Division, Brookhaven National Laboratory
Abstract
To demonstrate the feasibility of automating ultrafast electron diffraction (UED) operation and diagnosing machine performance in real time, a two-stage machine learning (ML) model based on self-consistent start-to-end simulations has been implemented. The model not only provides the machine parameters with adequate precision, a step toward full automation of the UED instrument, but also makes real-time electron-beam information available as a single-shot, nondestructive diagnostic. Furthermore, building on an understanding of the underlying connection between electron-beam properties and the features of Bragg-diffraction patterns, we applied the hidden symmetry as a model constraint, improving the accuracy of the energy-spread prediction by a factor of five and making the beam-divergence prediction twice as fast. The capability enabled by ML-based global optimization opens better opportunities for discovery with near-parallel, bright, and ultrafast electron beams for single-shot imaging. It also enables direct visualization of the dynamics of defects and nanostructured materials, which is not possible with present electron-beam technologies.
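For readers unfamiliar with the idea of chaining two learned mappings, the sketch below illustrates the general structure of a two-stage surrogate in Python. It is an illustration only, not the authors' implementation: the library (scikit-learn), the toy data, the assumed feature names (machine settings, beam properties, Bragg-pattern features), and the network sizes are all assumptions, and the symmetry constraints described in the abstract are not included.

```python
# Illustrative sketch only: two chained regressors trained on placeholder
# "simulation" data.  Shapes, feature names, and hyperparameters are assumed.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Placeholder data: machine settings -> beam properties -> pattern features.
n_samples = 2000
machine_settings = rng.uniform(-1.0, 1.0, size=(n_samples, 4))  # e.g. gun phase, solenoid currents (assumed)
beam_properties = np.column_stack([                              # e.g. divergence, energy spread (assumed)
    np.tanh(machine_settings @ rng.normal(size=4)),
    np.sin(machine_settings @ rng.normal(size=4)),
])
pattern_features = np.column_stack([                             # e.g. Bragg-peak width, intensity ratio (assumed)
    beam_properties[:, 0] ** 2 + 0.1 * beam_properties[:, 1],
    np.cos(beam_properties[:, 1]),
])

# Stage 1: machine settings -> beam properties.
stage1 = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
stage1.fit(machine_settings, beam_properties)

# Stage 2: beam properties -> diffraction-pattern features.
stage2 = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
stage2.fit(beam_properties, pattern_features)

# Chained prediction: new machine settings -> predicted pattern features.
new_settings = rng.uniform(-1.0, 1.0, size=(5, 4))
predicted_beam = stage1.predict(new_settings)
predicted_pattern = stage2.predict(predicted_beam)
print(predicted_pattern.shape)  # (5, 2)
```

In a layout like this the intermediate beam properties stay explicit between the two stages, which is what would allow them to double as single-shot, nondestructive beam diagnostics as described in the abstract.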
Related papers (50 in total; items 31-40 shown)
  • [31] Du, Dajun; Li, Kang; Li, Xue; Fei, Minrui; Wang, Haikuan. A multi-output two-stage locally regularized model construction method using the extreme learning machine. NEUROCOMPUTING, 2014, 128: 104-112.
  • [32] Kim, Junyaup; Woo, Simon S. Efficient Two-stage Model Retraining for Machine Unlearning. 2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION WORKSHOPS (CVPRW 2022), 2022: 4360-4368.
  • [33] Senocak, Ali Ulvi Galip; Yilmaz, M. Tugrul; Kalkan, Sinan; Yucel, Ismail; Amjad, Muhammad. An explainable two-stage machine learning approach for precipitation forecast. JOURNAL OF HYDROLOGY, 2023, 627.
  • [34] Fan, Yimin; Wang, Zhiyuan; Lin, Yuanpeng; Tan, Haisheng. Enhance The Performance Of Navigation: A Two-Stage Machine Learning Approach. 2020 6TH INTERNATIONAL CONFERENCE ON BIG DATA COMPUTING AND COMMUNICATIONS (BIGCOM 2020), 2020: 212-219.
  • [35] Bertsimas, Dimitris; Kim, Cheol Woo. A machine learning approach to two-stage adaptive robust optimization. EUROPEAN JOURNAL OF OPERATIONAL RESEARCH, 2024, 319 (01): 16-30.
  • [36] DiCiurcio, Kevin J.; Wu, Boyu; Xu, Fei; Rodemer, Scott; Wang, Qian. Equity Factor Timing: A Two-Stage Machine Learning Approach. JOURNAL OF PORTFOLIO MANAGEMENT, 2024, 50 (03): 132-148.
  • [37] Kamala, F. Rosita; Thangaiah, P. Ranjit Jeba. A Novel Two-Stage Selection of Feature Subsets in Machine Learning. ENGINEERING TECHNOLOGY & APPLIED SCIENCE RESEARCH, 2019, 9 (03): 4169-4175.
  • [38] Zhao, Guohan; Xiang, Lingyun; Zhu, Chengzhang; Li, Feng. Two-stage Unsupervised Multiple Kernel Extreme Learning Machine. 2018 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2018: 800-805.
  • [39] Alizadeh, Farhad; Gharamaleki, Alireza Faregh; Jalilzadeh, Rasoul. A two-stage multiple-point conceptual model to predict river stage-discharge process using machine learning approaches. JOURNAL OF WATER AND CLIMATE CHANGE, 2021, 12 (01): 278-295.
  • [40] Li, Tongtong; Mao, Junfeng; Yu, Jiandong; Zhao, Ziyang; Chen, Miao; Yao, Zhijun; Fang, Lei; Hu, Bin. Fully automated classification of pulmonary nodules in positron emission tomography-computed tomography imaging using a two-stage multimodal learning approach. QUANTITATIVE IMAGING IN MEDICINE AND SURGERY, 2024, 14 (08): 5526-5540.