Evaluation of a transformer-based model for the temporal forecast of coarse particulate matter (PMCO) concentrations

Cited: 0
Authors
Mauricio-Alvarez, Luis Eduardo [1 ]
Aceves-Fernandez, Marco Antonio [1 ]
Pedraza-Ortega, Jesus Carlos [1 ]
Ramos-Arreguin, Juan Manuel [1 ]
Affiliations
[1] Autonomous Univ Queretaro, Fac Engn, Cerro Campanas, Queretaro 76010, Queretaro, Mexico
Keywords
Deep learning; Forecasting; Transformer; Air pollution; PMCO; Neural network
DOI
10.1007/s12145-024-01330-6
Chinese Library Classification (CLC) number
TP39 [Computer applications]
Subject classification codes
081203; 0835
Abstract
Accurate forecasting of coarse particulate matter (PMCO) concentrations is crucial for mitigating health risks and environmental impacts in urban areas. This study evaluates the performance of a transformer-based deep learning model for predicting PMCO levels using 2022 data from four monitoring stations (BJU, MER, TLA, UIZ) in Mexico City. The transformer model's forecasting accuracy is assessed for horizons of 12, 24, 48, and 72 hours ahead and compared against conventional autoregressive integrated moving average (ARIMA) and long short-term memory (LSTM) models. Error metrics including root mean square error (RMSE), mean absolute error (MAE), and mean absolute percentage error (MAPE) are employed for evaluation. Results demonstrate the transformer model's superior performance, achieving the lowest error values across multiple stations and prediction horizons. However, challenges are identified for short-term forecasts and sites near industrial areas with high PMCO variability. The study highlights the transformer model's potential for accurate PMCO forecasting while underscoring the need for interdisciplinary approaches to address complex air pollution dynamics in urban environments.
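The abstract evaluates forecasts with RMSE, MAE, and MAPE. As a minimal sketch (not taken from the paper), these standard metrics can be computed with NumPy as follows; the arrays of observed and forecast PMCO concentrations and the small epsilon guard are illustrative assumptions, not values from the study.

```python
# Minimal sketch of the three error metrics named in the abstract, computed
# with NumPy for hypothetical observed vs. forecast PMCO series (ug/m3).
import numpy as np

def rmse(y_true, y_pred):
    # Root mean square error: square root of the mean squared residual.
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def mae(y_true, y_pred):
    # Mean absolute error: mean of the absolute residuals.
    return float(np.mean(np.abs(y_true - y_pred)))

def mape(y_true, y_pred, eps=1e-8):
    # Mean absolute percentage error; eps is an assumed guard against
    # division by zero when an observed concentration is 0.
    return float(np.mean(np.abs((y_true - y_pred) / (y_true + eps))) * 100.0)

# Made-up hourly PMCO values for illustration only.
y_true = np.array([18.0, 22.5, 30.1, 27.4])
y_pred = np.array([17.2, 24.0, 28.8, 29.0])
print(rmse(y_true, y_pred), mae(y_true, y_pred), mape(y_true, y_pred))
```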
Pages: 3095-3110
Number of pages: 16
Related papers
50 records in total
  • [41] A Novel Transformer-Based Model for Dialog State Tracking
    Miao, Yu
    Liu, Kuilong
    Yang, Wenbo
    Yang, Changyuan
    CROSS-CULTURAL DESIGN-APPLICATIONS IN BUSINESS, COMMUNICATION, HEALTH, WELL-BEING, AND INCLUSIVENESS, CCD 2022, PT III, 2022, 13313 : 148 - 156
  • [42] An ensemble transformer-based model for Arabic sentiment analysis
    Mohamed, Omar
    Kassem, Aly M. M.
    Ashraf, Ali
    Jamal, Salma
    Mohamed, Ensaf Hussein
    SOCIAL NETWORK ANALYSIS AND MINING, 2022, 13 (01)
  • [43] Learning Daily Human Mobility with a Transformer-Based Model
    Wang, Weiying
    Osaragi, Toshihiro
    ISPRS INTERNATIONAL JOURNAL OF GEO-INFORMATION, 2024, 13 (02)
  • [44] A Transformer-based Audio Captioning Model with Keyword Estimation
    Koizumi, Yuma
    Masumura, Ryo
    Nishida, Kyosuke
    Yasuda, Masahiro
    Saito, Shoichiro
    INTERSPEECH 2020, 2020, : 1977 - 1981
  • [45] Transformer-based heart language model with electrocardiogram annotations
    Tudjarski, Stojancho
    Gusev, Marjan
    Kanoulas, Evangelos
    SCIENTIFIC REPORTS, 15 (01)
  • [46] An Improved Transformer-Based Model for Urban Pedestrian Detection
    Wu, Tianyong
    Li, Xiang
    Dong, Qiuxuan
    INTERNATIONAL JOURNAL OF COMPUTATIONAL INTELLIGENCE SYSTEMS, 18 (01)
  • [47] LVBERT: Transformer-Based Model for Latvian Language Understanding
    Znotins, Arturs
    Barzdins, Guntis
    HUMAN LANGUAGE TECHNOLOGIES - THE BALTIC PERSPECTIVE (HLT 2020), 2020, 328 : 111 - 115
  • [48] A Transformer-based Embedding Model for Personalized Product Search
    Bi, Keping
    Ai, Qingyao
    Croft, W. Bruce
    PROCEEDINGS OF THE 43RD INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL (SIGIR '20), 2020, : 1521 - 1524
  • [49] Predicting the formation of NADES using a transformer-based model
    Ayres, Lucas B.
    Gomez, Federico J. V.
    Silva, Maria Fernanda
    Linton, Jeb R.
    Garcia, Carlos D.
    SCIENTIFIC REPORTS, 2024, 14 (01)
  • [50] A Transformer-based Medical Visual Question Answering Model
    Liu, Lei
    Su, Xiangdong
    Guo, Hui
    Zhu, Daobin
    2022 26TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2022, : 1712 - 1718