CityTransformer: A Transformer-Based Model for Contaminant Dispersion Prediction in a Realistic Urban Area

Cited: 2
Authors
Asahi, Yuuichi [1 ]
Onodera, Naoyuki [1 ]
Hasegawa, Yuta [1 ]
Shimokawabe, Takashi [2 ]
Shiba, Hayato [2 ]
Idomura, Yasuhiro [1 ]
Affiliations
[1] Japan Atomic Energy Agency, Center for Computational Science & e-Systems, Chiba 277-0827, Japan
[2] University of Tokyo, Information Technology Center, Chiba 277-0882, Japan
Keywords
Deep learning; Graphics-processing-unit-based computing; Lattice Boltzmann method; Urban plume dispersion; Large-eddy simulation; Plume dispersion; Neural networks; Flow; Parametrization; Turbulence; Canopy; CFD
DOI
10.1007/s10546-022-00777-8
Chinese Library Classification: P4 (Atmospheric Sciences / Meteorology)
Discipline Codes: 0706; 070601
Abstract
We develop a Transformer-based deep learning model to predict plume concentrations in an urban area under statistically stationary flow conditions with stationary and homogeneous forcing. The model has two distinct types of input layers: Transformer layers for sequential data and convolutional layers for image-like data. It predicts the plume concentration from realistically available data, such as time-series monitoring data at a few observation stations, the building shapes, and the source location. The model gives reasonably accurate predictions in less than a second. We also show that exactly the same model can be applied to predict the source location and emission rate with reasonable accuracy.
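The two-branch design described in the abstract can be illustrated with a minimal PyTorch sketch. This is an assumption-based illustration, not the authors' CityTransformer implementation: the class name TwoBranchDispersionModel, all layer sizes, the number of stations, and the 64 x 64 output grid are hypothetical choices. The idea it shows is the one stated in the abstract: a Transformer encoder summarizes the station time series, a small CNN encodes the image-like inputs (building-shape and source-location maps), and the fused features are decoded into a concentration field.

```python
# Minimal sketch (not the authors' released code) of a two-branch network in the
# spirit of the abstract: a Transformer encoder for time-series data from a few
# monitoring stations plus a CNN encoder for image-like data. All shapes and
# hyperparameters below are illustrative assumptions.
import torch
import torch.nn as nn


class TwoBranchDispersionModel(nn.Module):
    def __init__(self, n_stations=14, n_channels=2, d_model=64, grid=64):
        super().__init__()
        self.grid = grid
        # Branch 1: Transformer encoder over the station time series.
        # Each time step carries the readings of all stations as one token.
        self.embed = nn.Linear(n_stations, d_model)
        enc_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=4, batch_first=True
        )
        self.transformer = nn.TransformerEncoder(enc_layer, num_layers=2)
        # Branch 2: CNN encoder over image-like inputs
        # (e.g. building-height map and source-location map as channels).
        self.cnn = nn.Sequential(
            nn.Conv2d(n_channels, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(8), nn.Flatten(),
            nn.Linear(32 * 8 * 8, d_model),
        )
        # Decoder: fuse both branches and predict a 2-D concentration field.
        self.decoder = nn.Sequential(
            nn.Linear(2 * d_model, 256), nn.ReLU(),
            nn.Linear(256, grid * grid),
        )

    def forward(self, series, images):
        # series: (batch, time, n_stations); images: (batch, n_channels, H, W)
        tokens = self.embed(series)                       # (batch, time, d_model)
        seq_feat = self.transformer(tokens).mean(dim=1)   # pool over time
        img_feat = self.cnn(images)                       # (batch, d_model)
        fused = torch.cat([seq_feat, img_feat], dim=-1)
        out = self.decoder(fused)
        return out.view(-1, 1, self.grid, self.grid)      # concentration map


if __name__ == "__main__":
    model = TwoBranchDispersionModel()
    series = torch.randn(2, 100, 14)    # 2 samples, 100 time steps, 14 stations
    images = torch.randn(2, 2, 64, 64)  # building-height and source-location maps
    print(model(series, images).shape)  # torch.Size([2, 1, 64, 64])
```

Under the same assumptions, the inverse task mentioned in the abstract (predicting the source location and emission rate) could reuse the identical encoders with the decoder swapped for a small regression head producing three scalars.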
Pages: 659-692 (34 pages)