Exploiting the Self-Attention Mechanism in Gas Sensor Array (GSA) Data With Neural Networks

Cited by: 2
Authors
Wang, Ningning [1 ]
Li, Silong [1 ]
Ye, Terry Tao [1 ]
Affiliations
[1] Southern Univ Sci & Technol, Dept Elect & Elect Engn, Shenzhen 518055, Peoples R China
Keywords
Sensors; Sensor arrays; Gases; Gas detectors; Quantization (signal); Feature extraction; Sensor phenomena and characterization; Gas classification; gas sensor array (GSA); long short-term memory (LSTM); self-attention mechanism; MACHINE OLFACTION; DISCRIMINATION;
DOI
10.1109/JSEN.2023.3240470
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology];
Subject classification codes
0808 ; 0809 ;
Abstract
Gas sensor array (GSA) data are a sequential series of values that represent the temporal conditions of the presence, absence, or mixture of gases, and they exhibit similarities to the textual stream of natural language that conveys semantic information. We speculate, and subsequently prove, that self-attention mechanisms also exist in GSA data and can be exploited for gas classification and recognition. We first convert GSA data into a 1-D token series (called WORDs in this work) through sampling and quantization of the sensor values, and then use an enhanced long short-term memory (LSTM) network, called LSTM-attention, to extract the self-attention mechanism in the GSA data. We demonstrate that LSTM-attention achieves much better performance (99.6%) than CNN-based networks as well as other GSA data processing techniques on the UCI dynamic gases dataset. We also find that the self-attention mechanism varies with different sampling and quantization levels during data acquisition.
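A minimal, hypothetical sketch of the pipeline the abstract describes, written in PyTorch. Everything here (quantize_to_words, LSTMAttention, the 16-level quantizer, the layer sizes) is an illustrative assumption, not the authors' implementation; it only shows the shape of the idea: uniformly quantize sampled sensor values into integer tokens ("WORDs"), encode them with an LSTM, and apply self-attention over the hidden states before classifying.

```python
import torch
import torch.nn as nn

def quantize_to_words(signal: torch.Tensor, n_levels: int = 16) -> torch.Tensor:
    """Map a sampled sensor trace onto integer tokens ("WORDs")
    by uniform quantization of its values. Hypothetical helper."""
    lo, hi = signal.min(), signal.max()
    norm = (signal - lo) / (hi - lo + 1e-8)      # scale into [0, 1]
    return (norm * n_levels).long().clamp(max=n_levels - 1)

class LSTMAttention(nn.Module):
    """Illustrative LSTM encoder with self-attention over its hidden
    states, mean-pooled into a gas-class prediction (sizes assumed)."""
    def __init__(self, n_tokens=16, embed_dim=32, hidden_dim=64,
                 n_heads=4, n_classes=4):
        super().__init__()
        self.embed = nn.Embedding(n_tokens, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.attn = nn.MultiheadAttention(hidden_dim, n_heads,
                                          batch_first=True)
        self.classifier = nn.Linear(hidden_dim, n_classes)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        x = self.embed(tokens)                  # (B, T) -> (B, T, E)
        h, _ = self.lstm(x)                     # (B, T, H)
        a, _ = self.attn(h, h, h)               # self-attention over steps
        return self.classifier(a.mean(dim=1))   # mean-pool, then classify

# Toy usage on a fake single-channel sensor trace of 200 samples.
sig = torch.randn(1, 200)
tokens = quantize_to_words(sig)                 # (1, 200) integer WORDs
logits = LSTMAttention()(tokens)                # (1, n_classes)
```

A real GSA reading has multiple sensor channels; one plausible extension is tokenizing each channel separately and concatenating or interleaving the token streams, but the paper's exact WORD construction should be taken from the text itself.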
Pages: 5988-5996
Page count: 9
Related papers
50 records in total
  • [21] Multiple Protein Subcellular Locations Prediction Based on Deep Convolutional Neural Networks with Self-Attention Mechanism
    Cong, Hanhan
    Liu, Hong
    Cao, Yi
    Chen, Yuehui
    Liang, Cheng
    INTERDISCIPLINARY SCIENCES-COMPUTATIONAL LIFE SCIENCES, 2022, 14 (02) : 421 - 438
  • [23] Siamese Recurrent Neural Network with a Self-Attention Mechanism for Bioactivity Prediction
    Fernandez-Llaneza, Daniel
    Ulander, Silas
    Gogishvili, Dea
    Nittinger, Eva
    Zhao, Hongtao
    Tyrchan, Christian
    ACS OMEGA, 2021, 6 (16) : 11086 - 11094
  • [24] Probabilistic Matrix Factorization Recommendation of Self-Attention Mechanism Convolutional Neural Networks With Item Auxiliary Information
    Zhang, Chenkun
    Wang, Cheng
    IEEE ACCESS, 2020, 8 (08) : 208311 - 208321
  • [25] Lipschitz Normalization for Self-Attention Layers with Application to Graph Neural Networks
    Dasoulas, George
    Scaman, Kevin
    Virmaux, Aladin
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139
  • [26] Global Convolutional Neural Networks With Self-Attention for Fisheye Image Rectification
    Kim, Byunghyun
    Lee, Dohyun
    Min, Kyeongyuk
    Chong, Jongwha
    Joe, Inwhee
    IEEE ACCESS, 2022, 10 : 129580 - 129587
  • [27] Sparse self-attention aggregation networks for neural sequence slice interpolation
    Wang, Zejin
    Liu, Jing
    Chen, Xi
    Li, Guoqing
    Han, Hua
    BIODATA MINING, 2021, 14
  • [28] Architecture Self-Attention Mechanism: Nonlinear Optimization for Neural Architecture Search
    Hao, Jie
    Zhu, William
    JOURNAL OF NONLINEAR AND VARIATIONAL ANALYSIS, 2021, 5 (01) : 119 - 140
  • [29] Combining convolutional neural networks and self-attention for fundus diseases identification
    Wang, Keya
    Xu, Chuanyun
    Li, Gang
    Zhang, Yang
    Zheng, Yu
    Sun, Chengjie
    SCIENTIFIC REPORTS, 2023, 13 (01)
  • [30] Original Music Generation using Recurrent Neural Networks with Self-Attention
    Jagannathan, Akash
    Chandrasekaran, Bharathi
    Dutta, Shubham
    Patil, Uma Rameshgouda
    Eirinaki, Magdalini
    2022 FOURTH IEEE INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE TESTING (AITEST 2022), 2022 : 56 - 63