Missing well-log reconstruction using a sequence self-attention deep-learning framework

Cited: 0
Authors
Lin, Lei [1 ]
Wei, Hao [1 ]
Wu, Tiantian [1 ]
Zhang, Pengyun [2 ]
Zhong, Zhi [1 ]
Li, Chenglong [1 ]
Affiliations
[1] China Univ Geosci, Key Lab Theory & Technol Petr Explorat & Dev Hube, Wuhan, Peoples R China
[2] China Oilfield Serv Ltd, Well Tech R&D Inst, Beijing, Peoples R China
Keywords
NEURAL-NETWORK; OIL-FIELD; PREDICTION; LITHOFACIES; POROSITY; FLUID; BASIN
DOI
10.1190/GEO2022-0757.1
CLC classification
P3 [Geophysics]; P59 [Geochemistry]
Subject classification
0708; 070902
Abstract
Well logging is a critical tool for reservoir evaluation and fluid identification. However, owing to borehole conditions, instrument failure, economic constraints, and other factors, some types of well logs are occasionally missing or unreliable. Existing log-curve reconstruction methods based on empirical formulas and fully connected deep neural networks (FCDNN) can only consider point-to-point mapping relationships. Recurrent neural networks can model multipoint correlations, but they are difficult to parallelize. To account for the correlation between log sequences while achieving computational parallelism, we develop a novel deep-learning framework for missing well-log reconstruction based on the state-of-the-art transformer architecture. The missing well-log transformer (MWLT) uses a self-attention mechanism instead of a recurrent structure to model the global dependencies of the inputs and outputs. To accommodate different usage requirements, we design the MWLT at three scales, small, base, and large, by adjusting the number of parameters in the network. A total of 8609 samples from 209 wells in the Sichuan Basin, China, are used for training and validation, and two additional blind wells are used for testing. A data augmentation strategy with random starting points is implemented to increase the robustness of the model. The results show that the proposed MWLT achieves significantly higher accuracy than the conventional Gardner's equation and data-driven approaches such as FCDNN and bidirectional long short-term memory, on both the validation data set and the blind test wells. MWLT-large and MWLT-base yield lower prediction errors than MWLT-small but require more training time. Two wells in the Songliao Basin, China, are used to evaluate the cross-regional generalization performance of our method. The generalizability test results demonstrate that density logs reconstructed by the MWLT remain the best match to the observed data compared with the other methods. The parallelizable MWLT automatically learns the global dependence of subsurface reservoir parameters, enabling efficient missing well-log reconstruction.
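The key operation behind the MWLT described above is scaled dot-product self-attention, which lets every depth sample of a log sequence attend to every other sample in a single parallel matrix computation. The following minimal NumPy sketch illustrates that operation only; it is not the authors' implementation, and all names, array shapes, and the choice of feature count are illustrative assumptions.

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Scaled dot-product self-attention over a well-log sequence.

    x: (seq_len, d_model) array, one row per depth sample.
    w_q, w_k, w_v: (d_model, d_model) learned projection matrices.
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    # Similarity of every depth position with every other position.
    scores = q @ k.T / np.sqrt(x.shape[1])            # (seq_len, seq_len)
    # Row-wise softmax: attention weights over all depth positions.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output is a weighted mix of the whole sequence, so the
    # dependence is global rather than point-to-point or recurrent.
    return weights @ v                                 # (seq_len, d_model)

rng = np.random.default_rng(0)
seq_len, d_model = 64, 16   # e.g. 64 depth samples, 16 input features
x = rng.standard_normal((seq_len, d_model))
w_q, w_k, w_v = (rng.standard_normal((d_model, d_model)) * 0.1 for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)
print(out.shape)  # (64, 16): same sequence length, globally mixed along depth
```

Because the whole attention map is one matrix product, all depth positions are processed simultaneously, which is the parallelism advantage over recurrent architectures that the abstract highlights.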
Pages: D391 - D410
Page count: 20
Related papers
50 records total
  • [31] Well-Log Information-Assisted High-Resolution Waveform Inversion Based on Deep Learning
    Yang, Senlin
    Alkhalifah, Tariq
    Ren, Yuxiao
    Liu, Bin
    Li, Yuanyuan
    Jiang, Peng
    IEEE GEOSCIENCE AND REMOTE SENSING LETTERS, 2023, 20
  • [32] VeerNet: Using Deep Neural Networks for Curve Classification and Digitization of Raster Well-Log Images
    Nasim, M. Quamer
    Patwardhan, Narendra
    Maiti, Tannistha
    Marrone, Stefano
    Singh, Tarry
    JOURNAL OF IMAGING, 2023, 9 (07)
  • [33] Mineral Prospectivity Mapping Using Deep Self-Attention Model
    Yin, Bojun
    Zuo, Renguang
    Sun, Siquan
    NATURAL RESOURCES RESEARCH, 2023, 32 (01) : 37 - 56
  • [34] Deep Transfer Learning With Self-Attention for Industry Sensor Fusion Tasks
    Zhang, Ze
    Farnsworth, Michael
    Song, Boyang
    Tiwari, Divya
    Tiwari, Ashutosh
    IEEE SENSORS JOURNAL, 2022, 22 (15) : 15235 - 15247
  • [35] Compressive sensing image reconstruction based on deep unfolding self-attention network
    Tian, Jin-Peng
    Hou, Bao-Jun
    Jilin Daxue Xuebao (Gongxueban)/Journal of Jilin University (Engineering and Technology Edition), 2024, 54 (10): : 3018 - 3026
  • [37] Magnetotelluric Data Inversion Based on Deep Learning With the Self-Attention Mechanism
    Xu, Kaijun
    Liang, Shuyuan
    Lu, Yan
    Hu, Zuzhi
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2024, 62
  • [39] Traffic Signal Control with Deep Reinforcement Learning and Self-attention Mechanism
    Zhang X.
    Nie S.
    Li Z.
    Zhang H.
    Jiaotong Yunshu Xitong Gongcheng Yu Xinxi/Journal of Transportation Systems Engineering and Information Technology, 2024, 24 (02): : 96 - 104
  • [40] High-Fidelity Permeability and Porosity Prediction Using Deep Learning With the Self-Attention Mechanism
    Yang, Liuqing
    Wang, Shoudong
    Chen, Xiaohong
    Chen, Wei
    Saad, Omar M.
    Zhou, Xu
    Nam Pham
    Geng, Zhicheng
    Fomel, Sergey
    Chen, Yangkang
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2023, 34 (07) : 3429 - 3443