Three-Dimensional Velocity Field Interpolation Based on Attention Mechanism

Cited: 0
Authors
Yao, Xingmiao [1 ,2 ]
Cui, Mengling [1 ]
Wang, Lian [1 ]
Li, Yangsiwei [1 ]
Zhou, Cheng [1 ]
Su, Mingjun [2 ,3 ]
Hu, Guangmin [1 ]
Affiliations
[1] Univ Elect Sci & Technol China, Sch Resources & Environm, Chengdu 611731, Peoples R China
[2] Univ Elect Sci & Technol China, Ctr Informat Geosci, Chengdu 611731, Peoples R China
[3] PetroChina, Res Inst Petr Explorat & Dev Northwest NWGI, Lanzhou 730020, Peoples R China
Source
APPLIED SCIENCES-BASEL | 2023, Vol. 13, Iss. 24
Funding
National Natural Science Foundation of China;
Keywords
three-dimensional interpolation; attention mechanism; transfer learning; dilated convolution;
DOI
10.3390/app132413045
Chinese Library Classification
O6 [Chemistry];
Discipline Classification Code
0703;
Abstract
The establishment of a three-dimensional velocity field is an essential step in seismic exploration and plays a crucial role in understanding complex underground geological structures. An accurate 3D velocity field is significant for seismic imaging, observation system design, precise positioning of underground geological targets, structural interpretation, and reservoir prediction. Obtaining an accurate 3D velocity field is therefore a focus of and challenge for this field of study. To achieve more accurate intelligent interpolation of the 3D velocity field, we built a network model based on the attention mechanism, JointA 3DUnet. Building on the traditional U-Net, we added triple attention blocks and channel attention blocks to enhance the interaction of information across dimensions while adapting to the different variation of geoscience data in the horizontal and vertical directions. The network also incorporates dilated convolution to enlarge the receptive field. During training, we introduced transfer learning to further enhance the network's performance on the interpolation task. In addition, our method is an unsupervised deep learning interpolation algorithm: it does not require a training set, learns solely from the input data, and automatically interpolates velocity values at the missing positions. We tested our method on both synthetic and real data. The results show that, compared with traditional intelligent interpolation methods, our approach can effectively interpolate the three-dimensional velocity field: the SNR increased to 36.22 dB, and the pointwise relative error decreased to 0.89%.
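To make the architectural ingredients named in the abstract concrete, the sketch below shows a minimal PyTorch implementation of two of them: a squeeze-and-excitation style channel-attention block and a dilated 3D convolution block. This is an illustrative sketch only, not the authors' JointA 3DUnet code; the module names, reduction ratio, dilation rate, and tensor sizes are assumptions made for the example.

```python
# Illustrative sketch (assumed components, not the published JointA 3DUnet code):
# a channel-attention block and a dilated 3D convolution block as described in the abstract.
import torch
import torch.nn as nn


class ChannelAttention3D(nn.Module):
    """Channel attention: global-average-pool each channel, then reweight channels."""
    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool3d(1)  # (B, C, D, H, W) -> (B, C, 1, 1, 1)
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c = x.shape[:2]
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1, 1)
        return x * w  # per-channel reweighting


class DilatedConvBlock3D(nn.Module):
    """3D convolution with dilation to enlarge the receptive field at the same cost."""
    def __init__(self, in_ch: int, out_ch: int, dilation: int = 2):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv3d(in_ch, out_ch, kernel_size=3, padding=dilation, dilation=dilation),
            nn.BatchNorm3d(out_ch),
            nn.ReLU(inplace=True),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.block(x)


if __name__ == "__main__":
    # A toy 3D velocity feature volume: batch of 1, 16 channels, 32x32x32 voxels.
    x = torch.randn(1, 16, 32, 32, 32)
    y = DilatedConvBlock3D(16, 16)(ChannelAttention3D(16)(x))
    print(y.shape)  # torch.Size([1, 16, 32, 32, 32])
```

In a U-Net-style interpolation network, such blocks would typically be inserted after the convolutional stages of the encoder and decoder so that channel reweighting and the enlarged receptive field act on the learned feature maps rather than on the raw velocity volume.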
Pages: 14