Mineral Prospectivity Mapping Using Deep Self-Attention Model

Cited by: 28
Authors
Yin, Bojun [1 ]
Zuo, Renguang [1 ]
Sun, Siquan [2 ]
Affiliations
[1] China Univ Geosci, State Key Lab Geol Proc & Mineral Resources, Wuhan 430074, Peoples R China
[2] Hubei Geol Survey, Wuhan 430034, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Mineral prospectivity mapping; Deep learning; Self-attention mechanism; Gold mineralization; Knowledge-driven method; Spatial evidence; Greenstone belt; Baguio district; Representation;
DOI
10.1007/s11053-022-10142-8
Chinese Library Classification (CLC): P [Astronomy, Earth Sciences];
Discipline code: 07;
Abstract
Multi-source data integration for mineral prospectivity mapping (MPM) is an effective approach for reducing uncertainty and improving MPM accuracy. Multi-source data (e.g., geological, geophysical, geochemical, remote sensing, and drilling) should first be identified as evidence layers that represent ore-prospecting-related features. Traditional methods for MPM often neglect the correlations between different evidence layers that vary with their spatial locations, which results in the loss of useful information when integrating them into a mineral potential map. In this study, a deep self-attention model was adopted to integrate multiple evidence layers supported by a self-attention mechanism that can capture the internal relationships between various evidence layers and consider the spatial heterogeneity simultaneously. The attention matrix of the self-attention mechanism was further visualized to improve the interpretability of the proposed deep neural network model. A case study was conducted to demonstrate the advantages of the deep self-attention model for producing a potential map linked to gold mineralization in the Suizao district, Hubei Province, China. The results show that the delineated high potential area for gold mineralization has a close spatial association with known mineral deposits and ore-controlling geological factors, suggesting a robust predictive model with an accuracy of 0.88. The comparative experiments demonstrated the effectiveness of the self-attention mechanism and the optimum depth of the deep self-attention model. The targeted areas delineated in this study can guide gold mineral exploration in the future.
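The abstract describes integrating multiple evidence layers with a self-attention mechanism whose attention matrix can be visualized for interpretability. The paper's exact architecture is not reproduced here, so the following is only a minimal NumPy sketch of scaled dot-product self-attention applied per pixel across evidence layers; the layer count, embedding dimensions, and random weights are illustrative assumptions, not the authors' configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z, axis=-1):
    # Numerically stable softmax.
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over evidence layers.

    X : (n_layers, d_in) array, one feature vector per evidence layer
        at a given pixel. Returns the attended features and the
        (n_layers, n_layers) attention matrix, whose rows show how
        strongly each layer attends to every other layer — the matrix
        the paper visualizes for interpretability.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    A = softmax(scores, axis=-1)  # each row sums to 1
    return A @ V, A

# Toy example: 5 hypothetical evidence layers (e.g., fault density,
# gravity, magnetics, two geochemical elements), each embedded as an
# 8-dimensional feature vector at one pixel.
n_layers, d_in, d_k = 5, 8, 4
X = rng.standard_normal((n_layers, d_in))
Wq, Wk, Wv = (rng.standard_normal((d_in, d_k)) for _ in range(3))
out, A = self_attention(X, Wq, Wk, Wv)
print(out.shape, A.shape)  # (5, 4) (5, 5)
```

In the full model such an attention block would sit inside a deep network trained on labeled deposit/non-deposit pixels; here it only shows how the pairwise layer-to-layer weights in `A` arise and why they vary from pixel to pixel (spatial heterogeneity), since `A` is recomputed from each pixel's own feature vectors.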
Pages: 37-56 (20 pages)