Exploiting the Self-Attention Mechanism in Gas Sensor Array (GSA) Data With Neural Networks

Cited by: 2
Authors
Wang, Ningning [1]
Li, Silong [1]
Ye, Terry Tao [1]
Affiliations
[1] Southern Univ Sci & Technol, Dept Elect & Elect Engn, Shenzhen 518055, Peoples R China
Keywords
Sensors; Sensor arrays; Gases; Gas detectors; Quantization (signal); Feature extraction; Sensor phenomena and characterization; Gas classification; gas sensor array (GSA); long short-term memory (LSTM); self-attention mechanism; machine olfaction; discrimination
DOI
10.1109/JSEN.2023.3240470
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronic and Communication Technology]
Discipline Classification Codes
0808; 0809
Abstract
Gas sensor array (GSA) data is a sequential series of values that represents the temporal presence, absence, or mixture of gases, and it exhibits similarities to the textual streams of natural language that carry semantic information. We speculate, and subsequently demonstrate, that a self-attention mechanism also exists in GSA data and can be exploited for gas classification and recognition. We first convert GSA data into a 1-D token series (called WORDs in this work) by sampling and quantizing the sensor values, and then use an enhanced long short-term memory (LSTM) network, called LSTM-attention, to extract the self-attention in the GSA data. We demonstrate that LSTM-attention achieves much better performance (99.6% accuracy) than CNN-based networks as well as other GSA data-processing techniques on the UCI dynamic gases dataset. We also find that the self-attention mechanism varies with the sampling and quantization levels used during data acquisition.
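The pipeline the abstract describes (quantize sensor readings into discrete WORD tokens, then classify the token sequence with an LSTM followed by self-attention) can be sketched as below. This is a minimal illustration, not the authors' implementation: the quantization scheme (per-sequence min-max bucketing), the layer sizes, the single sensor channel, and the placement of multi-head self-attention over the LSTM hidden states are assumptions chosen for concreteness.

```python
# Hypothetical sketch of a WORD-tokenization + LSTM-attention classifier;
# all hyperparameters are illustrative, not taken from the paper.
import torch
import torch.nn as nn

def to_words(readings: torch.Tensor, n_levels: int = 16) -> torch.Tensor:
    """Quantize a (batch, time) series of raw sensor values into integer
    tokens ("WORDs"): min-max normalize each sequence, then bucket the
    normalized values into n_levels discrete levels."""
    lo = readings.min(dim=1, keepdim=True).values
    hi = readings.max(dim=1, keepdim=True).values
    norm = (readings - lo) / (hi - lo + 1e-8)       # values in [0, 1]
    return (norm * (n_levels - 1)).long()           # tokens in [0, n_levels)

class LSTMAttention(nn.Module):
    """LSTM encoder with multi-head self-attention over its hidden states,
    followed by mean pooling and a linear classification head."""
    def __init__(self, n_levels=16, emb=32, hidden=64, n_classes=4):
        super().__init__()
        self.embed = nn.Embedding(n_levels, emb)
        self.lstm = nn.LSTM(emb, hidden, batch_first=True)
        self.attn = nn.MultiheadAttention(hidden, num_heads=4, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, tokens):                      # tokens: (batch, time)
        x = self.embed(tokens)                      # (batch, time, emb)
        h, _ = self.lstm(x)                         # (batch, time, hidden)
        a, _ = self.attn(h, h, h)                   # self-attention over states
        return self.head(a.mean(dim=1))             # (batch, n_classes)

# Usage: one sensor channel, 200 time steps, 4 target gas classes.
readings = torch.randn(8, 200)                      # (batch, time) raw values
logits = LSTMAttention()(to_words(readings))        # (8, 4)
```

Applying attention on top of the recurrent hidden states, rather than on the raw tokens, lets the model weight relationships between temporally distant sensor events, which is one plausible reading of how "LSTM-attention" combines the two components.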
Pages: 5988-5996
Page count: 9