ISFNN: an enhanced neural network for parametric modeling of passive devices with input skip-connections

Cited by: 0
Authors
Ren, Yimin; Deng, Xiaojiao; Zheng, Xiaoping
Keywords
Parametric devices;
DOI
10.1007/s10489-024-05853-9
Abstract
High-performance passive devices play a critical role in the radio frequency front-end of wireless systems. Accurately characterizing the electromagnetic (EM) responses of these devices is a formidable challenge, particularly in high-frequency design. Commercial numerical solvers often demand substantial computational resources and require a complete recalculation for any structural modification. This paper proposes an input skip-connections feedforward neural network (ISFNN) for the parametric modeling of passive devices. The ISFNN architecture incorporates multiple skip connections within its layer blocks, periodically re-injecting the input design variables into the intermediate hidden layers. This design facilitates feature combination and enhances the network's ability to extract information from the design variables. Because the intermediate layers carry both the original input features and the features learned by preceding layers, the model can effectively and robustly capture the nonlinear relationships between the design variables and the EM responses. Additionally, a systematic algorithm is proposed to develop and train the ISFNN model. The ISFNN offers a unified solution for both single-physics EM analysis and EM-centric multi-physics (MP) analysis. Compared with other ANN-based models, the ISFNN achieves smaller testing errors with fewer training samples for single-physics EM modeling. Furthermore, in certain applications, MP analysis is more closely aligned with the actual operating conditions of high-performance microwave components; in this case, the non-geometrical parameters are incorporated into the input design features. The ISFNN accurately predicts EM responses using only MP training data, without requiring additional single-physics EM data, transfer functions, or multiple mapping modules. Validation examples demonstrate the improved modeling accuracy and efficiency of the ISFNN. Within the design space, the trained ISFNN can generate new MP data in just 0.02 seconds per calculation, significantly reducing time costs while maintaining high modeling precision. © The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature 2024.
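The abstract does not give implementation details, but the core idea of re-injecting the raw design variables into each hidden block can be illustrated with a minimal sketch. The sketch below uses PyTorch; the layer widths, depth, activation function, and the concatenation-based skip scheme are assumptions made for illustration, not the authors' exact architecture or training algorithm.

    import torch
    import torch.nn as nn

    class InputSkipBlock(nn.Module):
        """One hidden block: consumes the previous block's features together
        with the original design variables (the input skip-connection)."""
        def __init__(self, feat_dim, input_dim, hidden_dim):
            super().__init__()
            self.fc = nn.Linear(feat_dim + input_dim, hidden_dim)
            self.act = nn.ReLU()

        def forward(self, h, x):
            # Concatenate learned features with the raw design variables
            return self.act(self.fc(torch.cat([h, x], dim=-1)))

    class ISFNNSketch(nn.Module):
        """Feedforward network whose hidden blocks periodically re-inject the
        input design variables (geometrical and, for MP modeling, the
        non-geometrical parameters) before predicting the EM responses."""
        def __init__(self, input_dim, output_dim, hidden_dim=64, num_blocks=4):
            super().__init__()
            self.stem = nn.Sequential(nn.Linear(input_dim, hidden_dim), nn.ReLU())
            self.blocks = nn.ModuleList(
                InputSkipBlock(hidden_dim, input_dim, hidden_dim)
                for _ in range(num_blocks)
            )
            self.head = nn.Linear(hidden_dim, output_dim)

        def forward(self, x):
            h = self.stem(x)
            for block in self.blocks:
                h = block(h, x)  # input skip-connection into every block
            return self.head(h)

    # Hypothetical usage: map 5 design variables to the real and imaginary
    # parts of one S-parameter at a frequency point (dimensions assumed).
    model = ISFNNSketch(input_dim=5, output_dim=2)
    responses = model(torch.rand(8, 5))  # batch of 8 candidate designs

Re-injecting x at every block means the later layers never lose direct access to the design variables, which is the property the abstract credits for the robust learning of the nonlinear mapping from design variables to EM responses.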
Related Papers (50 in total)
  • [1] Wu X.; Dai P.; Lu S.; Luo Z.; Sun J.; Yuan K. A Lightweight Super-resolution Network with Skip-connections. Current Medical Imaging, 2024, 20.
  • [2] Lundby, Erlend Torje Berg; Robinson, Haakon; Rasheed, Adil; Halvorsen, Ivar Johan; Gravdahl, Jan Tommy. Sparse neural networks with skip-connections for identification of aluminum electrolysis cell. 2023 62nd IEEE Conference on Decision and Control (CDC), 2023: 5506-5513.
  • [3] Soltani, Mohammadreza; Wu, Suya; Li, Yuerong; Ding, Jie; Tarokh, Vahid. On The Energy Statistics of Feature Maps in Pruning of Neural Networks with Skip-Connections. 2022 Data Compression Conference (DCC), 2022: 482.
  • [4] Lin, Zhongqi; Jia, Jingdun; Gao, Wanlin; Huang, Feng. Fine-grained visual categorization of butterfly specimens at sub-species level via a convolutional neural network with skip-connections. Neurocomputing, 2020, 384: 295-313.
  • [5] Kittelsen, Jonas Ekeland; Antonelo, Eric Aislan; Camponogara, Eduardo; Imsland, Lars Struen. Physics-Informed Neural Networks with skip connections for modeling and. Applied Soft Computing, 2024, 158.
  • [6] Shi, Yupeng; Rong, Weicong; Zheng, Nengheng. Speech Enhancement using Convolutional Neural Network with Skip Connections. 2018 11th International Symposium on Chinese Spoken Language Processing (ISCSLP), 2018: 6-10.
  • [7] Jin, Jing; Feng, Feng; Zhang, Jianan; Yan, Shuxia; Na, Weicong; Zhang, Qijun. A Novel Deep Neural Network Topology for Parametric Modeling of Passive Microwave Components. IEEE Access, 2020, 8: 82273-82285.
  • [8] Benmeziane, Hadjer; Ounnoughene, Amine Ziad; Hamzaoui, Imane; Bouhadjar, Younes. Skip Connections in Spiking Neural Networks: An Analysis of Their Effect on Network Training. 2023 IEEE International Parallel and Distributed Processing Symposium Workshops (IPDPSW), 2023: 790-794.
  • [9] Shan, Dongjing; Zhang, Xiongwei; Shi, Wenhua; Li, Li. Neural Architecture Search for a Highly Efficient Network with Random Skip Connections. Applied Sciences-Basel, 2020, 10(11).
  • [10] Oyedotun, Oyebade K.; Al Ismaeil, Kassem; Aouada, Djamila. Why Is Everyone Training Very Deep Neural Network With Skip Connections? IEEE Transactions on Neural Networks and Learning Systems, 2023, 34(9): 5961-5975.