A Lithological Classification Model Based on Fourier Neural Operators and Channel-Wise Self-Attention

Cited: 0
Authors
Marques Jr, Ademir [1 ]
Silva, Luiz Jose Schirmer [1 ]
Cagliari, Joice [1 ,2 ]
Scalco, Leonardo [1 ]
da Silva, Luiza Carine Ferreira [1 ]
Veronez, Mauricio Roberto [1 ]
Gonzaga Jr, Luiz [1 ]
Affiliations
[1] Unisinos Univ, Ctr Excellence Geoinformat & Visual Comp, VizLab, BR-93020 Sao Leopoldo, RS, Brazil
[2] Unisinos Univ, Grad Program Geol, BR-93020 Sao Leopoldo, RS, Brazil
Keywords
Attention mechanisms; convolutional neural networks (CNNs); deep learning (DL); Fourier transform; geology; image classification; rocks; stratigraphy; transfer learning; ARARIPE BASIN;
DOI
10.1109/LGRS.2024.3438547
Chinese Library Classification
P3 [Geophysics]; P59 [Geochemistry];
Discipline Codes
0708; 070902;
Abstract
Lithological characterization plays a crucial role in geological studies, and outcrops serve as the primary source of geological information. Automatic identification of lithologies enhances geological mapping and reduces costs and risks associated with mapping less accessible outcrops. These outcrops are typically imaged using remote sensing techniques, enabling the identification of geological structures and lithologies through computer vision and machine learning (ML) approaches. In this context, convolutional neural networks (CNNs) have significantly contributed to lithological characterization in outcrop images. Recent advancements include novel architectures based on residual and attention blocks, which improve upon base CNN models. In addition, transformer-based architectures have surpassed CNNs in various tasks. Taking a step further, we propose a novel architecture that incorporates Fourier operators. Our proposed architecture builds upon the Transformer model, utilizing a sequential combination of Fourier neural operators (FNOs) and channel-wise self-attention layers. To train our model, we adopt a transfer learning strategy, initially training it on a texture dataset with 47 classes. Subsequently, we fine-tune the same model to classify five specific lithologies in our custom dataset. These lithologies include sandstone, gray and brownish-gray shale, limestone, and laminated limestone images from the Três Irmãos quarry within the Araripe Basin, an outcrop analogous to oil exploration reservoirs. The proposed architecture achieved an F1-score of up to 98%, performing better than reference CNN and ResNet models. This advancement holds promise for accurate lithological characterization, benefiting geological research and exploration efforts.
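The sequential FNO-plus-channel-wise-self-attention block described in the abstract could be sketched roughly as follows. This is an illustrative NumPy sketch, not the authors' implementation: the number of retained Fourier modes, the complex channel-mixing weights, and the choice to compute attention between channels over flattened spatial descriptors are all assumptions made for illustration.

```python
import numpy as np

def fno_spectral_layer(x, weights, modes=8):
    """One FNO-style spectral convolution: FFT the feature map, keep only
    the low-frequency modes, mix channels with learned complex weights,
    then inverse-FFT back to the spatial domain.
    x: (C, H, W) real features; weights: (C_in, C_out, modes, modes) complex."""
    x_ft = np.fft.rfft2(x)                       # (C, H, W//2+1), complex
    out_ft = np.zeros_like(x_ft)
    # channel mixing is applied only on the retained low-frequency modes
    out_ft[:, :modes, :modes] = np.einsum(
        "ioxy,ixy->oxy", weights, x_ft[:, :modes, :modes])
    return np.fft.irfft2(out_ft, s=x.shape[-2:])  # back to (C, H, W)

def channelwise_self_attention(x):
    """Self-attention over the channel axis: each channel attends to every
    other channel, using its flattened spatial map as query/key/value."""
    c = x.shape[0]
    flat = x.reshape(c, -1)                          # (C, H*W)
    scores = flat @ flat.T / np.sqrt(flat.shape[1])  # (C, C) similarities
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=-1, keepdims=True)         # softmax over channels
    return (attn @ flat).reshape(x.shape)            # re-weighted channels

# one sequential block: spectral convolution followed by channel attention
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 32, 32))                 # 4 channels, 32x32
w = (rng.standard_normal((4, 4, 8, 8))
     + 1j * rng.standard_normal((4, 4, 8, 8))) * 0.1
y = channelwise_self_attention(fno_spectral_layer(x, w))
print(y.shape)  # (4, 32, 32)
```

Truncating to low-frequency modes is what makes the spectral layer resolution-agnostic and cheap; the channel-wise attention then re-weights feature maps globally, which is the complementary role the abstract assigns to the two layer types.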
Page count: 5
Related Papers
50 records in total
  • [31] A Self-attention Based LSTM Network for Text Classification
    Jing, Ran
    2019 3RD INTERNATIONAL CONFERENCE ON CONTROL ENGINEERING AND ARTIFICIAL INTELLIGENCE (CCEAI 2019), 2019, 1207
  • [32] Web service classification based on self-attention mechanism
    Jia, Zhichun
    Zhang, Zhiying
    Dong, Rui
    Yang, Zhongxuan
    Xing, Xing
    2023 35TH CHINESE CONTROL AND DECISION CONFERENCE, CCDC, 2023, : 2164 - 2169
  • [33] Fourier or Wavelet bases as counterpart self-attention in spikformer for efficient visual classification
    Wang, Qingyu
    Zhang, Duzhen
    Cai, Xinyuan
    Zhang, Tielin
    Xu, Bo
    FRONTIERS IN NEUROSCIENCE, 2025, 18
  • [34] Self-attention Based Collaborative Neural Network for Recommendation
    Ma, Shengchao
    Zhu, Jinghua
    WIRELESS ALGORITHMS, SYSTEMS, AND APPLICATIONS, WASA 2019, 2019, 11604 : 235 - 246
  • [35] Adaptive Feature Self-Attention in Spiking Neural Networks for Hyperspectral Classification
    Li, Heng
    Tu, Bing
    Liu, Bo
    Li, Jun
    Plaza, Antonio
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2025, 63
  • [36] CERVICAL CELL CLASSIFICATION USING MULTI-SCALE FEATURE FUSION AND CHANNEL-WISE CROSS-ATTENTION
    Shi, Jun
    Zhu, Xinyu
    Zhang, Yuan
    Zheng, Yushan
    Jiang, Zhiguo
    Zheng, Liping
    2023 IEEE 20TH INTERNATIONAL SYMPOSIUM ON BIOMEDICAL IMAGING, ISBI, 2023,
  • [37] Image classification model based on large kernel attention mechanism and relative position self-attention mechanism
    Liu S.
    Wei J.
    Liu G.
    Zhou B.
    PeerJ Computer Science, 2023, 9
  • [39] Dynamic Structured Neural Topic Model with Self-Attention Mechanism
    Miyamoto, Nozomu
    Isonuma, Masaru
    Takase, Sho
    Mori, Junichiro
    Sakata, Ichiro
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, ACL 2023, 2023, : 5916 - 5930
  • [40] Sample Expansion and Classification Model of Maize Leaf Diseases Based on the Self-Attention CycleGAN
    Guo, Hongliang
    Li, Mingyang
    Hou, Ruizheng
    Liu, Hanbo
    Zhou, Xudan
    Zhao, Chunli
    Chen, Xiao
    Gao, Lianxing
    SUSTAINABILITY, 2023, 15 (18)