Mineral Prospectivity Mapping Using Deep Self-Attention Model

Cited by: 28
Authors:
Yin, Bojun [1 ]
Zuo, Renguang [1 ]
Sun, Siquan [2 ]
Affiliations:
[1] China Univ Geosci, State Key Lab Geol Proc & Mineral Resources, Wuhan 430074, Peoples R China
[2] Hubei Geol Survey, Wuhan 430034, Peoples R China
Funding:
National Natural Science Foundation of China
Keywords:
Mineral prospectivity mapping; Deep learning; Self-attention mechanism; Gold mineralization; KNOWLEDGE-DRIVEN METHOD; SPATIAL EVIDENCE; GREENSTONE-BELT; BAGUIO DISTRICT; REPRESENTATION;
DOI: 10.1007/s11053-022-10142-8
Chinese Library Classification: P [Astronomy, Earth Sciences]
Discipline Code: 07
Abstract:
Multi-source data integration for mineral prospectivity mapping (MPM) is an effective approach for reducing uncertainty and improving MPM accuracy. Multi-source data (e.g., geological, geophysical, geochemical, remote sensing, and drilling data) must first be transformed into evidence layers that represent ore-prospecting-related features. Traditional MPM methods often neglect the correlations between evidence layers, which vary with spatial location, resulting in a loss of useful information when the layers are integrated into a mineral potential map. In this study, a deep self-attention model was adopted to integrate multiple evidence layers, using a self-attention mechanism that captures the internal relationships between evidence layers while simultaneously accounting for spatial heterogeneity. The attention matrix of the self-attention mechanism was further visualized to improve the interpretability of the proposed deep neural network model. A case study demonstrates the advantages of the deep self-attention model for producing a potential map of gold mineralization in the Suizao district, Hubei Province, China. The results show that the delineated high-potential areas for gold mineralization have a close spatial association with known mineral deposits and ore-controlling geological factors, indicating a robust predictive model with an accuracy of 0.88. Comparative experiments demonstrated the effectiveness of the self-attention mechanism and identified the optimum depth of the deep self-attention model. The targeted areas delineated in this study can guide future gold exploration.
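The abstract's core idea, weighting each evidence layer by its learned relationship to every other layer, can be sketched as a minimal scaled dot-product self-attention step in NumPy. This is an illustrative sketch only: the number of layers, the feature dimension, and the use of the raw features as queries, keys, and values are assumptions, not the paper's actual architecture, which applies learned projections inside a deep network.

```python
import numpy as np

def self_attention(X):
    """Scaled dot-product self-attention over evidence layers.

    X: (n_layers, d) array; each row is one evidence layer's feature
    vector at a given spatial location (hypothetical encoding).
    Returns the attention-integrated features and the attention matrix
    (the quantity the paper visualizes for interpretability).
    """
    d = X.shape[1]
    # For illustration, X serves as queries, keys, and values alike;
    # a trained model would use learned projections W_q, W_k, W_v.
    scores = X @ X.T / np.sqrt(d)                # (n_layers, n_layers)
    scores -= scores.max(axis=1, keepdims=True)  # numerical stability
    A = np.exp(scores)
    A /= A.sum(axis=1, keepdims=True)            # rows sum to 1
    return A @ X, A

# Example: 5 evidence layers (e.g., geological, geophysical, geochemical,
# remote sensing, drilling), each encoded as an 8-dim feature vector.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))
out, A = self_attention(X)
print(out.shape, A.shape)  # (5, 8) (5, 5)
```

Computing this per grid cell lets the attention matrix `A` differ from location to location, which is how such a model can account for spatial heterogeneity in the inter-layer correlations.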
Pages: 37-56 (20 pages)
Related Papers (50 total):
  • [31] Improved mineral prospectivity mapping using graph neural networks
    Sihombing, Felix M. H.
    Palin, Richard M.
    Hughes, Hannah S. R.
    Robb, Laurence J.
    ORE GEOLOGY REVIEWS, 2024, 172
  • [32] Mapping mineral prospectivity through big data analytics and a deep learning algorithm
    Xiong, Yihui
    Zuo, Renguang
    Carranza, Emmanuel John M.
    ORE GEOLOGY REVIEWS, 2018, 102 : 811 - 817
  • [33] Deep ConvLSTM With Self-Attention for Human Activity Decoding Using Wearable Sensors
    Singh, Satya P.
    Sharma, Madan Kumar
    Lay-Ekuakille, Aime
    Gangwar, Deepak
    Gupta, Sukrit
    IEEE SENSORS JOURNAL, 2021, 21 (06) : 8575 - 8582
  • [34] Self-attention Adversarial Based Deep Subspace Clustering
    Yin M.
    Wu H.-Y.
    Xie S.-L.
    Yang Q.-Y.
    Zidonghua Xuebao/Acta Automatica Sinica, 2022, 48 (01): : 271 - 281
  • [35] Reconstructing computational spectra using deep learning’s self-attention method
    Wu, Hao
    Wu, Hui
    Su, Xinyu
    Wu, Jingjun
    Liu, Shuangli
    Optica Applicata, 2024, 54 (03) : 383 - 394
  • [36] SELF-ATTENTION GUIDED DEEP FEATURES FOR ACTION RECOGNITION
    Xiao, Renyi
    Hou, Yonghong
    Guo, Zihui
    Li, Chuankun
    Wang, Pichao
    Li, Wanqing
    2019 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO (ICME), 2019, : 1060 - 1065
  • [37] MetaTransformer: deep metagenomic sequencing read classification using self-attention models
    Wichmann, Alexander
    Buschong, Etienne
    Mueller, Andre
    Juenger, Daniel
    Hildebrandt, Andreas
    Hankeln, Thomas
    Schmidt, Bertil
    NAR GENOMICS AND BIOINFORMATICS, 2023, 5 (03)
  • [38] Deep Learning Approach to Impact Classification in Sensorized Panels Using Self-Attention
    Karmakov, Stefan
    Aliabadi, M. H. Ferri
    SENSORS, 2022, 22 (12)
  • [39] Sentence Matching with Deep Self-attention and Co-attention Features
    Wang, Zhipeng
    Yan, Danfeng
    KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT, KSEM 2021, PT II, 2021, 12816 : 550 - 561
  • [40] Homogeneous Learning: Self-Attention Decentralized Deep Learning
    Sun, Yuwei
    Ochiai, Hideya
    IEEE ACCESS, 2022, 10 : 7695 - 7703