QAR Data Imputation Using Generative Adversarial Network with Self-Attention Mechanism

Cited by: 2
Authors
Zhao, Jingqi [1 ]
Rong, Chuitian [1 ]
Dang, Xin [1 ]
Sun, Huabo [2 ]
Affiliations
[1] Tiangong Univ, Sch Comp Sci & Technol, Tianjin 300387, Peoples R China
[2] China Acad Civil Aviat Sci & Technol, Inst Aviat Safety, Beijing 100028, Peoples R China
Source
BIG DATA MINING AND ANALYTICS | 2024, Vol. 7, No. 1
Funding
National Natural Science Foundation of China
Keywords
multivariate time series; data imputation; self-attention; Generative Adversarial Network (GAN); missing data
DOI
10.26599/BDMA.2023.9020001
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
The Quick Access Recorder (QAR), an important device for storing data from various flight parameters, contains a large amount of valuable data and comprehensively records the real state of an airline flight. However, the recorded data contain missing values due to factors such as weather and equipment anomalies. These missing values seriously hinder the analysis of QAR data by aeronautical engineers, for example in flight scenario reproduction and flight safety status assessment. Imputing the missing values in QAR data, which further helps guarantee airline flight safety, is therefore crucial. QAR data are also multivariate, multiprocess, and temporal. We therefore propose the imputation models A-AEGAN ("A" denotes attention mechanism, "AE" denotes autoencoder, and "GAN" denotes Generative Adversarial Network) and SA-AEGAN ("SA" denotes self-attention mechanism), which can be effectively applied to missing values in QAR data. Specifically, we apply a novel generative adversarial network to impute the missing values. An improved gated recurrent unit is introduced as the neural unit of the GAN, which captures the temporal relationships in QAR data. In addition, we modify the basic structure of the GAN by using an autoencoder as the generator and a recurrent neural network as the discriminator; the missing values are imputed through the adversarial interplay between the generator and the discriminator. We introduce an attention mechanism into the autoencoder to further improve the model's ability to capture the features of QAR data: the attention mechanism preserves the correlations among QAR parameters and improves the model's ability to impute missing data. Furthermore, we extend the model with a self-attention mechanism to better capture the relationships between different parameters within the QAR data. Experimental results on real datasets demonstrate that the proposed models impute the missing values in QAR data with excellent results.
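
The following is a minimal sketch, not the authors' implementation, of the architecture the abstract describes: an autoencoder generator with self-attention and a GRU-based recurrent discriminator, with observed entries kept and missing entries filled by the generator. PyTorch, all class and function names, and the hyperparameters are assumptions for illustration only.

# Minimal sketch (assumed PyTorch) of the SA-AEGAN idea: autoencoder generator
# with self-attention, GRU discriminator, mask-based imputation.
import torch
import torch.nn as nn

class AttentiveAEGenerator(nn.Module):
    """Autoencoder generator: GRU encoder, self-attention over hidden states, GRU decoder."""
    def __init__(self, n_features: int, hidden: int = 64, n_heads: int = 4):
        super().__init__()
        self.encoder = nn.GRU(n_features, hidden, batch_first=True)
        self.attn = nn.MultiheadAttention(hidden, n_heads, batch_first=True)
        self.decoder = nn.GRU(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, n_features)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h, _ = self.encoder(x)          # (batch, time, hidden)
        a, _ = self.attn(h, h, h)       # self-attention relates time steps/parameters
        d, _ = self.decoder(a)
        return self.out(d)              # reconstructed series used for imputation

class GRUDiscriminator(nn.Module):
    """Recurrent discriminator: scores each time step as observed (real) or imputed (fake)."""
    def __init__(self, n_features: int, hidden: int = 64):
        super().__init__()
        self.gru = nn.GRU(n_features, hidden, batch_first=True)
        self.out = nn.Linear(hidden, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h, _ = self.gru(x)
        return torch.sigmoid(self.out(h))   # per-step probability of being observed

def impute(x: torch.Tensor, mask: torch.Tensor, gen: AttentiveAEGenerator) -> torch.Tensor:
    # mask == 1 where a value was recorded, 0 where it is missing;
    # observed entries are kept, missing entries are taken from the generator.
    return mask * x + (1 - mask) * gen(x)

In an adversarial training loop under these assumptions, the discriminator would be trained to separate observed from generated entries, while the generator is trained both to fool it and to reconstruct the observed entries.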
Pages: 12-28
Number of pages: 17
相关论文
共 50 条
  • [1] Dialogue Generation Using Self-Attention Generative Adversarial Network
    Hatua, Amartya
    Nguyen, Trung T.
    Sung, Andrew H.
    [J]. 2019 IEEE INTERNATIONAL CONFERENCE ON CONVERSATIONAL DATA & KNOWLEDGE ENGINEERING (CDKE), 2019, : 33 - 38
  • [2] Missing Data Repairs for Traffic Flow With Self-Attention Generative Adversarial Imputation Net
    Zhang, Weibin
    Zhang, Pulin
    Yu, Yinghao
    Li, Xiying
    Biancardo, Salvatore Antonio
    Zhang, Junyi
    [J]. IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2022, 23 (07) : 7919 - 7930
  • [3] SELF-ATTENTION GENERATIVE ADVERSARIAL NETWORK FOR SPEECH ENHANCEMENT
    Huy Phan
    Nguyen, Huy Le
    Chen, Oliver Y.
    Koch, Philipp
    Duong, Ngoc Q. K.
    McLoughlin, Ian
    Mertins, Alfred
    [J]. 2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 7103 - 7107
  • [4] Self-attention generative adversarial network with the conditional constraint
    Jia, Yufeng
    Ma, Li
    [J]. Xi'an Dianzi Keji Daxue Xuebao/Journal of Xidian University, 2019, 46 (06): : 163 - 170
  • [5] Attribute Network Representation Learning Based on Generative Adversarial Network and Self-attention Mechanism
    Li, Shanshan
    Tang, Meiling
    Dong, Yingnan
    [J]. International Journal of Network Security, 2024, 26 (01) : 51 - 58
  • [6] Improving the Spatial Resolution of Solar Images Using Generative Adversarial Network and Self-attention Mechanism*
    Deng, Junlan
    Song, Wei
    Liu, Dan
    Li, Qin
    Lin, Ganghua
    Wang, Haimin
    [J]. ASTROPHYSICAL JOURNAL, 2021, 923 (01):
  • [7] Data augmentation for skin lesion using self-attention based progressive generative adversarial network
    Abdelhalim, Ibrahim Saad Aly
    Mohamed, Mamdouh Farouk
    Mahdy, Yousef Bassyouni
    [J]. EXPERT SYSTEMS WITH APPLICATIONS, 2021, 165
  • [8] Self-Attention Generative Adversarial Networks
    Zhang, Han
    Goodfellow, Ian
    Metaxas, Dimitris
    Odena, Augustus
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019, 97
  • [9] Missing Data Imputation for Online Monitoring of Power Equipment Based on Self-attention Generative Adversarial Networks
    Zhou, Yuanxiang
    Lin, Menglong
    Chen, Jianning
    Bai, Zheng
    Chen, Ming
    [J]. Gaodianya Jishu/High Voltage Engineering, 2023, 49 (05): : 1795 - 1809
  • [10] SAPCGAN: Self-Attention based Generative Adversarial Network for Point Clouds
    Li, Yushi
    Baciu, George
    [J]. PROCEEDINGS OF 2020 IEEE 19TH INTERNATIONAL CONFERENCE ON COGNITIVE INFORMATICS & COGNITIVE COMPUTING (ICCI*CC 2020), 2020, : 52 - 59