An End-to-End Neural Network for Complex Electromagnetic Simulations

Cited by: 8
Authors
Zhai, Menglin [1 ,2 ]
Chen, Yaobo [1 ,2 ]
Xu, Longting [1 ,2 ]
Yin, Wen-Yan [3 ]
Affiliations
[1] Donghua Univ, Coll Informat Sci & Technol, Shanghai, Peoples R China
[2] Minist Educ, Engn Res Ctr Digitized Text & Apparel Technol, Shanghai 201620, Peoples R China
[3] Zhejiang Univ, Innovat Inst Electromagnet Informat & Elect Integr, Coll Informat Sci & Elect Engn, Key Lab Adv Micro, Hangzhou 310027, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Electromagnetic forward simulation; finite-difference time-domain (FDTD); neural network; solver
DOI
10.1109/LAWP.2023.3294499
Chinese Library Classification
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology]
Discipline Classification Codes
0808; 0809
Abstract
Although many numerical methods, such as the finite-difference time-domain (FDTD) method, can accurately solve time-domain electromagnetic (EM) simulation problems, their computational cost is usually significant for complex scenarios. In this letter, we investigate the feasibility of applying deep learning to accurately solve EM forward simulations. Within an end-to-end neural network framework, a convolutional neural network is used to extract features of the scatterers, and a long short-term memory network is used to predict the EM field distributions. To preserve accuracy, especially when dealing with complicated phenomena, principal component analysis is employed to compress the data sets before training. Numerical experiments show that the proposed scheme can predict EM field distributions efficiently and accurately for complex scenarios containing scatterers of different materials, locations, geometrical shapes, and numbers. The average relative mean square error is around 8.05e-5 for scenarios with a fixed number of scatterers and 2.25e-4 for scenarios with a random number of scatterers, respectively, which outperforms other neural network frameworks. Meanwhile, compared with FDTD, the proposed scheme achieves a speedup of around 1528 times.
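A minimal sketch, assuming PyTorch and hypothetical grid sizes, layer widths, time-step counts, and numbers of PCA components (the letter does not specify these), of how the CNN feature extractor, LSTM predictor, and PCA-compressed field targets described in the abstract could fit together; it is not the authors' implementation.

```python
# Sketch only: hypothetical CNN -> LSTM pipeline regressing PCA coefficients
# of the EM field, as outlined in the abstract. All sizes are assumptions.
import torch
import torch.nn as nn


class CNNEncoder(nn.Module):
    """Extract features of the scatterer configuration (e.g., a permittivity map)."""

    def __init__(self, feat_dim=256):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.fc = nn.LazyLinear(feat_dim)

    def forward(self, scatterer_map):            # (B, 1, H, W)
        x = self.conv(scatterer_map)
        return self.fc(x.flatten(1))             # (B, feat_dim)


class FieldPredictor(nn.Module):
    """Predict PCA-compressed EM field snapshots over time with an LSTM."""

    def __init__(self, feat_dim=256, hidden=512, n_components=128, n_steps=100):
        super().__init__()
        self.n_steps = n_steps
        self.encoder = CNNEncoder(feat_dim)
        self.lstm = nn.LSTM(feat_dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_components)   # PCA coefficients per step

    def forward(self, scatterer_map):
        feat = self.encoder(scatterer_map)                    # (B, feat_dim)
        seq = feat.unsqueeze(1).repeat(1, self.n_steps, 1)    # replicate over time
        out, _ = self.lstm(seq)                               # (B, T, hidden)
        return self.head(out)                                 # (B, T, n_components)


# Usage: field snapshots would first be compressed with PCA (e.g.,
# sklearn.decomposition.PCA) so the network regresses a few coefficients per
# time step instead of the full grid; predicted coefficients are mapped back
# to field maps with the inverse PCA transform.
model = FieldPredictor()
coeffs = model(torch.randn(4, 1, 64, 64))       # 4 scatterer maps on a 64x64 grid
print(coeffs.shape)                              # torch.Size([4, 100, 128])
```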
Pages: 2522-2526
Number of pages: 5