Exploiting the Self-Attention Mechanism in Gas Sensor Array (GSA) Data With Neural Networks

Cited by: 2
Authors:
Wang, Ningning [1 ]
Li, Silong [1 ]
Ye, Terry Tao [1 ]
Affiliations:
[1] Southern Univ Sci & Technol, Dept Elect & Elect Engn, Shenzhen 518055, Peoples R China
Keywords:
Sensors; Sensor arrays; Gases; Gas detectors; Quantization (signal); Feature extraction; Sensor phenomena and characterization; Gas classification; gas sensor array (GSA); long short-term memory (LSTM); self-attention mechanism; MACHINE OLFACTION; DISCRIMINATION;
DOI
10.1109/JSEN.2023.3240470
Chinese Library Classification (CLC): TM [Electrical Engineering]; TN [Electronics and Communication Technology]
Subject classification codes: 0808; 0809
Abstract
Gas sensor array (GSA) data is a sequential series of values that represents the temporal presence, absence, or mixture of gases, and it exhibits similarities to the textual streams of natural language that carry semantic information. We speculate, and subsequently prove, that self-attention mechanisms also exist in GSA data and can be exploited for gas classification and recognition. We first convert GSA data into a 1-D token series (called WORDs in this work) through sampling and quantization of the sensor values, and then use an enhanced long short-term memory (LSTM) network, called LSTM-attention, to extract the self-attention mechanism in the GSA data. We demonstrate that LSTM-attention achieves much better performance (99.6%) than CNN-based networks, as well as other GSA data processing techniques, on the UCI dynamic gases dataset. We also find that the self-attention mechanism varies with the sampling and quantization levels used during data acquisition.
Pages: 5988-5996 (9 pages)
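
The abstract describes a two-step pipeline: sensor traces are sampled and quantized into a 1-D token series (WORDs), which an LSTM network augmented with self-attention then classifies. Below is a minimal sketch of that idea in PyTorch; the uniform quantization scheme, the use of `nn.MultiheadAttention`, and all layer sizes are illustrative assumptions, not the paper's exact LSTM-attention architecture.

```python
# Minimal sketch of the GSA tokenization + LSTM-attention pipeline from the
# abstract. All design details (uniform quantization, multihead attention,
# layer sizes) are assumptions for illustration only.
import torch
import torch.nn as nn


def quantize_to_words(signal: torch.Tensor, n_levels: int = 64) -> torch.Tensor:
    """Map a sampled sensor trace to a 1-D series of integer tokens (WORDs)
    by uniformly quantizing the normalized sensor values."""
    lo, hi = signal.min(), signal.max()
    normalized = (signal - lo) / (hi - lo + 1e-8)    # scale into [0, 1]
    return (normalized * (n_levels - 1)).long()      # integer token ids


class LSTMAttention(nn.Module):
    """LSTM encoder followed by self-attention over its hidden states,
    mean-pooled into a gas-class prediction (illustrative sketch)."""

    def __init__(self, vocab_size=64, embed_dim=32, hidden_dim=64,
                 n_heads=4, n_classes=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.attn = nn.MultiheadAttention(hidden_dim, n_heads,
                                          batch_first=True)
        self.head = nn.Linear(hidden_dim, n_classes)

    def forward(self, tokens):                 # tokens: (batch, seq_len)
        x = self.embed(tokens)                 # (batch, seq_len, embed_dim)
        h, _ = self.lstm(x)                    # (batch, seq_len, hidden_dim)
        a, _ = self.attn(h, h, h)              # self-attention over LSTM states
        return self.head(a.mean(dim=1))        # mean-pool, then classify


# Usage with a synthetic sensor trace (real GSA data would be used instead).
trace = torch.randn(200).cumsum(0)             # fake sequential sensor values
words = quantize_to_words(trace).unsqueeze(0)  # (1, 200) WORD token series
logits = LSTMAttention()(words)                # (1, n_classes)
```

In this sketch the embedding vocabulary size equals the number of quantization levels, which mirrors the abstract's observation that the sampling and quantization levels chosen during data acquisition shape the self-attention patterns the model can extract.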