Evaluation of a transformer-based model for the temporal forecast of coarse particulate matter (PMCO) concentrations

Cited by: 0
Authors
Mauricio-Alvarez, Luis Eduardo [1 ]
Aceves-Fernandez, Marco Antonio [1 ]
Pedraza-Ortega, Jesus Carlos [1 ]
Ramos-Arreguin, Juan Manuel [1 ]
Affiliations
[1] Autonomous Univ Queretaro, Fac Engn, Cerro Campanas, Queretaro 76010, Queretaro, Mexico
Keywords
Deep learning; Forecasting; Transformer; Air pollution; PMCO; Neural network
DOI
10.1007/s12145-024-01330-6
Chinese Library Classification (CLC)
TP39 [Computer applications]
Subject classification codes
081203; 0835
Abstract
Accurate forecasting of coarse particulate matter (PMCO) concentrations is crucial for mitigating health risks and environmental impacts in urban areas. This study evaluates the performance of a transformer-based deep learning model for predicting PMCO levels using 2022 data from four monitoring stations (BJU, MER, TLA, UIZ) in Mexico City. The transformer model's forecasting accuracy is assessed for horizons of 12, 24, 48, and 72 hours ahead and compared against conventional autoregressive integrated moving average (ARIMA) and long short-term memory (LSTM) models. Error metrics including root mean square error (RMSE), mean absolute error (MAE), and mean absolute percentage error (MAPE) are employed for evaluation. Results demonstrate the transformer model's superior performance, achieving the lowest error values across multiple stations and prediction horizons. However, challenges are identified for short-term forecasts and sites near industrial areas with high PMCO variability. The study highlights the transformer model's potential for accurate PMCO forecasting while underscoring the need for interdisciplinary approaches to address complex air pollution dynamics in urban environments.
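The three error metrics named in the abstract can be sketched in plain Python. The function name `forecast_errors` and the sample concentration values below are illustrative assumptions, not code or data from the paper:

```python
import math

def forecast_errors(y_true, y_pred):
    """Return (RMSE, MAE, MAPE%) for paired observed/predicted series,
    the three metrics the abstract uses for model evaluation."""
    if len(y_true) != len(y_pred) or not y_true:
        raise ValueError("series must be non-empty and of equal length")
    n = len(y_true)
    # Root mean square error: penalizes large deviations quadratically.
    rmse = math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n)
    # Mean absolute error: average magnitude of the forecast error.
    mae = sum(abs(t - p) for t, p in zip(y_true, y_pred)) / n
    # Mean absolute percentage error: scale-free, undefined when t == 0.
    mape = 100.0 * sum(abs((t - p) / t) for t, p in zip(y_true, y_pred)) / n
    return rmse, mae, mape

# Illustrative PMCO-like concentrations (µg/m³), not data from the study.
observed = [10.0, 20.0, 30.0]
predicted = [12.0, 18.0, 33.0]
rmse, mae, mape = forecast_errors(observed, predicted)
```

MAPE is reported as a percentage; because it divides by the observed value, it is unstable near zero concentrations, which is one reason studies typically report it alongside RMSE and MAE.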
Pages: 3095-3110
Page count: 16
Related papers
50 records in total
  • [31] Evaluation of reinforcement learning in transformer-based molecular design
    He, Jiazhen
    Tibo, Alessandro
    Janet, Jon Paul
    Nittinger, Eva
    Tyrchan, Christian
    Czechtizky, Werngard
    Engkvist, Ola
    JOURNAL OF CHEMINFORMATICS, 2024, 16 (01)
  • [32] Spatial and temporal variability of outdoor coarse particulate matter mass concentrations measured with a new coarse particle sampler during the Detroit Exposure and Aerosol Research Study
    Thornburg, Jonathan
    Rodes, Charles E.
    Lawless, Phillip A.
    Williams, Ron
    ATMOSPHERIC ENVIRONMENT, 2009, 43 (28) : 4251 - 4258
  • [33] Towards an astronomical foundation model for stars with a transformer-based model
    Leung, Henry W.
    Bovy, Jo
    MONTHLY NOTICES OF THE ROYAL ASTRONOMICAL SOCIETY, 2024, 527 (01) : 1494 - 1520
  • [34] Prediction of fine particulate matter concentrations based on generalized hidden Markov model
    Zhang H.
    Yu J.
    Liu X.
    Lei H.
    Zhang, Hao (haozhang@swu.edu.cn), 2018, Materials China (69): 1215 - 1220
  • [35] Transformer-Based Unified Neural Network for Quality Estimation and Transformer-Based Re-decoding Model for Machine Translation
    Chen, Cong
    Zong, Qinqin
    Luo, Qi
    Qiu, Bailian
    Li, Maoxi
    MACHINE TRANSLATION, CCMT 2020, 2020, 1328 : 66 - 75
  • [36] A Swin Transformer-based model for mosquito species identification
    Zhao, De-zhong
    Wang, Xin-kai
    Zhao, Teng
    Li, Hu
    Xing, Dan
    Gao, He-ting
    Song, Fan
    Chen, Guo-hua
    Li, Chun-xiao
    SCIENTIFIC REPORTS, 2022, 12 (01)
  • [38] AN EFFICIENT TRANSFORMER-BASED MODEL FOR VOICE ACTIVITY DETECTION
    Zhao, Yifei
    Champagne, Benoit
    2022 IEEE 32ND INTERNATIONAL WORKSHOP ON MACHINE LEARNING FOR SIGNAL PROCESSING (MLSP), 2022
  • [39] DLGNet: A Transformer-based Model for Dialogue Response Generation
    Olabiyi, Oluwatobi
    Mueller, Erik T.
    NLP FOR CONVERSATIONAL AI, 2020, : 54 - 62
  • [40] TRANSQL: A Transformer-based Model for Classifying SQL Queries
    Tahmasebi, Shirin
    Payberah, Amir H.
    Soylu, Ahmet
    Roman, Dumitru
    Matskin, Mihhail
    2022 21ST IEEE INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND APPLICATIONS, ICMLA, 2022, : 788 - 793