CityTransformer: A Transformer-Based Model for Contaminant Dispersion Prediction in a Realistic Urban Area

Cited by: 2
Authors
Asahi, Yuuichi [1 ]
Onodera, Naoyuki [1 ]
Hasegawa, Yuta [1 ]
Shimokawabe, Takashi [2 ]
Shiba, Hayato [2 ]
Idomura, Yasuhiro [1 ]
Affiliations
[1] Japan Atomic Energy Agency, Center for Computational Science & e-Systems, Chiba 2770827, Japan
[2] University of Tokyo, Information Technology Center, Chiba 2770882, Japan
Keywords
Deep learning; Graphics-processing-unit-based computing; Lattice Boltzmann method; Urban plume dispersion; Large-eddy simulation; Plume dispersion; Neural networks; Flow; Parametrization; Turbulence; Canopy; CFD
DOI
10.1007/s10546-022-00777-8
Chinese Library Classification
P4 [Atmospheric Science (Meteorology)];
Discipline Code
0706; 070601;
Abstract
We develop a Transformer-based deep learning model to predict plume concentrations in an urban area under statistically stationary flow conditions with stationary and homogeneous forcing. The model has two distinct input branches: Transformer layers for sequential data and convolutional layers for image-like data. It predicts the plume concentration from realistically available inputs, such as time-series monitoring data at a few observation stations, the building shapes, and the source location. The model gives reasonably accurate predictions in less than a second. We also show that exactly the same model can be applied to predict the source location and emission rate, again with reasonable accuracy.
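The abstract describes a two-branch architecture that fuses a Transformer encoder for station time series with a convolutional encoder for image-like inputs (building shapes and source location) before decoding a concentration field. The sketch below illustrates one plausible way to wire such a two-branch model in PyTorch; all layer sizes, tensor shapes, channel counts, and the class name TwoBranchDispersionModel are illustrative assumptions, not the authors' published CityTransformer configuration.

# Minimal sketch of a two-branch model: a Transformer encoder over time-series
# monitoring data plus a CNN encoder over image-like data (building map and
# source-location map). All hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn


class TwoBranchDispersionModel(nn.Module):
    def __init__(self, n_stations=14, n_features=4, d_model=128, grid=64):
        super().__init__()
        # Branch 1: Transformer over time series from a few stations.
        # Each time step is flattened to (n_stations * n_features) values.
        self.series_proj = nn.Linear(n_stations * n_features, d_model)
        enc_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=8, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=4)
        # Branch 2: CNN over 2-channel image-like data (buildings, source).
        self.cnn = nn.Sequential(
            nn.Conv2d(2, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, d_model, 3, stride=2, padding=1), nn.ReLU(),
        )
        # Decoder: fuse both branches and upsample to a concentration map.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(2 * d_model, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 1, 4, stride=2, padding=1),
        )

    def forward(self, series, images):
        # series: (batch, time, n_stations * n_features)
        # images: (batch, 2, grid, grid)
        tokens = self.encoder(self.series_proj(series))   # (B, T, d_model)
        series_feat = tokens.mean(dim=1)                   # (B, d_model)
        img_feat = self.cnn(images)                        # (B, d_model, g, g)
        # Broadcast the pooled time-series feature over the spatial grid.
        b, c, h, w = img_feat.shape
        fused = torch.cat(
            [img_feat, series_feat[:, :, None, None].expand(b, c, h, w)], dim=1)
        return self.decoder(fused)                         # (B, 1, grid, grid)


if __name__ == "__main__":
    model = TwoBranchDispersionModel()
    series = torch.randn(2, 100, 14 * 4)  # 100 time steps, 14 stations, 4 features
    images = torch.randn(2, 2, 64, 64)    # building map + source-location map
    print(model(series, images).shape)    # torch.Size([2, 1, 64, 64])

The same fused representation could in principle feed a small regression head for the inverse task mentioned in the abstract (source location and emission rate); that head is not shown here.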
Pages: 659-692
Number of pages: 34
Related Papers (50 in total)
  • [21] Korangi, Kamesh; Mues, Christophe; Bravo, Cristian. A transformer-based model for default prediction in mid-cap corporate markets. European Journal of Operational Research, 2023, 308(1): 306-320.
  • [22] Wang, Ting-Wei; Lai, Shang-Hong. Multi-Modal Pedestrian Crossing Intention Prediction with Transformer-Based Model. APSIPA Transactions on Signal and Information Processing, 2024, 13(5).
  • [23] Meng, Lingkuan; Chen, Xingjian; Cheng, Ke; Chen, Nanjun; Zheng, Zetian; Wang, Fuzhou; Sun, Hongyan; Wong, Ka-Chun. TransPTM: a transformer-based model for non-histone acetylation site prediction. Briefings in Bioinformatics, 2024, 25(3).
  • [24] Tong, Weitian; Limperis, Jordan; Hamza-Lup, Felix; Xu, Yao; Li, Lixin. Robust Transformer-based model for spatiotemporal PM2.5 prediction in California. Earth Science Informatics, 2024, 17(1): 315-328.
  • [25] Lin, Fudong; Yuan, Xu; Zhang, Yihe; Sigdel, Purushottam; Chen, Li; Peng, Lu; Tzeng, Nian-Feng. Comprehensive Transformer-Based Model Architecture for Real-World Storm Prediction. Machine Learning and Knowledge Discovery in Databases: Applied Data Science and Demo Track, ECML PKDD 2023, Part VII, 2023, 14175: 54-71.
  • [26] Rao, Shishir; Li, Yikuan; Ramakrishnan, Rema; Hassaine, Abdelaali; Canoy, Dexter; Cleland, John; Lukasiewicz, Thomas; Salimi-Khorshidi, Gholamreza; Rahimi, Kazem. An Explainable Transformer-Based Deep Learning Model for the Prediction of Incident Heart Failure. IEEE Journal of Biomedical and Health Informatics, 2022, 26(7): 3362-3372.
  • [27] Wang, Ting Wei; Lai, Shang-Hong. Pedestrian Crossing Intention Prediction with Multi-Modal Transformer-Based Model. 2023 Asia Pacific Signal and Information Processing Association Annual Summit and Conference (APSIPA ASC), 2023: 1349-1356.
  • [28] Al-Thani, Mansoor G.; Sheng, Ziyu; Cao, Yuting; Yang, Yin. Traffic Transformer: Transformer-based framework for temporal traffic accident prediction. AIMS Mathematics, 2024, 9(5): 12610-12629.
  • [29] Qian, Meiling; Lu, Weizhong; Zhang, Yu; Liu, Junkai; Wu, Hongjie; Lu, Yaoyao; Li, Haiou; Fu, Qiming; Shen, Jiyun; Xiao, Yongbiao. Transformer and Graph Transformer-Based Prediction of Drug-Target Interactions. Current Bioinformatics, 2024, 19(5): 470-481.
  • [30] Wang, Dongsheng; Tang, Kangjie; Zeng, Jun; Pan, Yue; Dai, Yun; Li, Huige; Han, Bin. MM-Transformer: A Transformer-Based Knowledge Graph Link Prediction Model That Fuses Multimodal Features. Symmetry-Basel, 2024, 16(8).