Large-Scale Training in Neural Compact Models for Accurate and Adaptable MOSFET Simulation

Cited by: 0
Authors
Park, Chanwoo [1 ]
Lee, Seungjun [1 ]
Park, Junghwan [1 ]
Rim, Kyungjin [1 ]
Park, Jihun [1 ]
Cho, Seonggook [1 ]
Jeon, Jongwook [2 ]
Cho, Hyunbo [1 ]
Affiliations
[1] Alsemy Inc, Res & Dev Ctr, Seoul 06154, South Korea
[2] Sungkyunkwan Univ, Sch Elect & Elect Engn, Suwon 03063, South Korea
Keywords
Integrated circuit modeling; Adaptation models; Data models; Mathematical models; Capacitance-voltage characteristics; Predictive models; MOSFET; Compact model; DTCO; foundation model; neural network;
DOI
10.1109/JEDS.2024.3417521
Chinese Library Classification
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology]
Discipline Codes
0808; 0809
Abstract
We address the challenges associated with traditional analytical models, such as BSIM, in semiconductor device modeling. These models often struggle to accurately represent the complex behaviors of miniaturized devices. Neural Compact Models (NCMs) offer an alternative with improved modeling capability, but their effectiveness is constrained by a reliance on extensive datasets. In real-world scenarios, where measurements for device modeling are often limited, this dependence becomes a significant hindrance. In response, this work presents a large-scale pre-training approach for NCMs. By utilizing extensive datasets spanning multiple technology nodes, our method enables NCMs to develop a more detailed understanding of device behavior, thereby improving the accuracy and adaptability of MOSFET device simulations, particularly when data availability is limited. Our study illustrates the potential benefits of large-scale pre-training in enhancing the capabilities of NCMs, offering a practical solution to one of the key challenges in current device modeling practices.
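To make the pre-train/fine-tune idea concrete, the following is a minimal, hypothetical sketch — not the paper's actual architecture, data, or training procedure. It pre-trains a tiny MLP surrogate on synthetic I-V data pooled from several "technology nodes" (modeled here simply as different threshold voltages in a textbook square-law MOSFET formula), then adapts it to an unseen node using only 20 samples. The `drain_current` formula, `TinyNCM` class, node Vth values, and all hyperparameters are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def drain_current(vgs, vds, vth, k=1.0):
    """Simplified square-law MOSFET drain current (illustration only, not BSIM)."""
    vov = np.maximum(vgs - vth, 0.0)           # overdrive voltage
    triode = k * (vov - vds / 2.0) * vds       # linear (triode) region
    sat = 0.5 * k * vov ** 2                   # saturation region
    return np.where(vds < vov, triode, sat)

class TinyNCM:
    """One-hidden-layer MLP trained by manual full-batch gradient descent on MSE."""
    def __init__(self, hidden=32):
        self.W1 = rng.normal(0.0, 0.5, (2, hidden)); self.b1 = np.zeros(hidden)
        self.W2 = rng.normal(0.0, 0.5, (hidden, 1)); self.b2 = np.zeros(1)

    def forward(self, X):
        self.Z = np.tanh(X @ self.W1 + self.b1)
        return self.Z @ self.W2 + self.b2

    def step(self, X, y, lr=0.05):
        err = self.forward(X) - y[:, None]     # (N, 1) residuals
        n = len(X)
        dW2 = self.Z.T @ err / n; db2 = err.mean(axis=0)
        dZ = (err @ self.W2.T) * (1.0 - self.Z ** 2)   # backprop through tanh
        dW1 = X.T @ dZ / n; db1 = dZ.mean(axis=0)
        self.W1 -= lr * dW1; self.b1 -= lr * db1
        self.W2 -= lr * dW2; self.b2 -= lr * db2
        return float((err ** 2).mean())        # MSE before this update

def make_data(vth, n):
    vgs = rng.uniform(0.0, 1.0, n); vds = rng.uniform(0.0, 1.0, n)
    return np.column_stack([vgs, vds]), drain_current(vgs, vds, vth)

# Pre-training: pooled data from three source nodes (Vth = 0.3, 0.4, 0.5 V).
Xs, ys = zip(*(make_data(vth, 400) for vth in (0.3, 0.4, 0.5)))
X_pre, y_pre = np.vstack(Xs), np.concatenate(ys)
model = TinyNCM()
pre_losses = [model.step(X_pre, y_pre) for _ in range(500)]

# Fine-tuning: only 20 "measurements" from an unseen node (Vth = 0.45 V).
X_ft, y_ft = make_data(0.45, 20)
ft_losses = [model.step(X_ft, y_ft, lr=0.02) for _ in range(200)]

print(f"pre-train MSE: {pre_losses[0]:.4f} -> {pre_losses[-1]:.4f}")
print(f"fine-tune MSE: {ft_losses[0]:.4f} -> {ft_losses[-1]:.4f}")
```

The key design point mirrors the abstract: the pre-trained weights encode behavior shared across nodes, so adaptation to a new node needs far fewer samples than training from scratch would.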
Pages: 745 - 751
Page count: 7
Related Papers
50 records in total
  • [31] Validation of laparoscopy and flexible ureteroscopy tasks in inanimate simulation training models at a large-scale conference setting
    Jirong Lu
    Karthik Thandapani
    Tricia Kuo
    Ho Yee Tiong
    Asian Journal of Urology, 2021, (02): 215 - 219
  • [32] Large-Scale Simulation Models in Population and Development - Comment
    Blandy, R.
    Population and Development Review, 1977, 3 (1-2): 123 - 125
  • [33] Uncertainty Quantification of Large-Scale Health Economic Simulation Models
    Zheng, P.
    Dinh, T.
    Value in Health, 2013, 16 (07): A595 - A595
  • [34] Large-scale TCP models using optimistic parallel simulation
    Yuan, G
    Carothers, CD
    Kalyanaraman, S
    Seventeenth Workshop on Parallel and Distributed Simulation (PADS 2003), Proceedings, 2003: 153 - 162
  • [35] Materialized community ground models for large-scale earthquake simulation
    Schlosser, Steven W.
    Ryan, Michael P.
    Taborda, Ricardo
    Lopez, Julio
    O'Hallaron, David R.
    Bielak, Jacobo
    International Conference for High Performance Computing, Networking, Storage and Analysis, 2008: 557 - +
  • [36] Modularization Guidelines in the Development of Large-Scale System Models for Simulation
    Gonzalez, S.
    Mendez, E.
    Kuhlmann, F.
    Castelazo, I.
    IEEE Transactions on Systems, Man, and Cybernetics, 1985, 15 (05): 665 - 669
  • [37] Large-Scale Simulation Models in Population and Development - Reply
    Arthur, W. B.
    McNicoll, G.
    Population and Development Review, 1977, 3 (1-2): 126 - 127
  • [38] Chimera: Efficiently Training Large-Scale Neural Networks with Bidirectional Pipelines
    Li, Shigang
    Hoefler, Torsten
    SC21: International Conference for High Performance Computing, Networking, Storage and Analysis, 2021
  • [39] Training Convolutional Neural Network for Sketch Recognition on Large-Scale Dataset
    Zhou, Wen
    Jia, Jinyuan
    International Arab Journal of Information Technology, 2020, 17 (01): 82 - 89
  • [40] Accelerating Large-Scale Distributed Neural Network Training with SPMD Parallelism
    Zhang, Shiwei
    Diao, Lansong
    Wu, Chuan
    Wang, Siyu
    Lin, Wei
    Proceedings of the 13th Symposium on Cloud Computing, SoCC 2022, 2022: 403 - 418