Data Efficient Lithography Modeling with Residual Neural Networks and Transfer Learning

Cited by: 18
Authors
Lin, Yibo [1 ]
Watanabe, Yuki [2 ]
Kimura, Taiki [2 ]
Matsunawa, Tetsuaki [2 ]
Nojima, Shigeki [2 ]
Li, Meng [1 ]
Pan, David Z. [1 ]
Affiliations
[1] Univ Texas Austin, Austin, TX 78712 USA
[2] Toshiba Memory Corp, Tokyo, Japan
DOI
10.1145/3177540.3178242
CLC (Chinese Library Classification): TP3 [computing technology; computer technology]
Subject classification code: 0812
Abstract
Lithography simulation is one of the key steps in physical verification, enabled by substantial optical and resist models. A resist model bridges the aerial image simulation to printed patterns. While the effectiveness of learning-based solutions for resist modeling has been demonstrated, they are considerably data-demanding. Meanwhile, a set of manufactured data for a specific lithography configuration is only valid for training one single model, indicating low data efficiency. Due to the complexity of the manufacturing process, obtaining enough data for acceptable accuracy becomes very expensive in terms of both time and cost, especially during the evolution of technology generations, when the design space is intensively explored. In this work, we propose a new resist modeling framework for contact layers that utilizes existing data from old technology nodes to reduce the amount of data required from a target lithography configuration. Built on residual neural networks and transfer learning techniques, our framework achieves a 2-10X reduction in the amount of training data while maintaining accuracy comparable to the state-of-the-art learning approach.
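The abstract's core idea, pretraining a residual network on abundant data from an old technology node and then fine-tuning it on a few samples from the target lithography configuration, can be sketched in miniature. The toy tasks, network shape, and hyperparameters below are illustrative assumptions, not the paper's actual CNN architecture or data:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

class ResidualRegressor:
    """Toy regressor with one residual block: y = (x + relu(x W1) W2) . w"""

    def __init__(self, d, h):
        self.W1 = rng.normal(0.0, 0.1, (d, h))
        self.W2 = rng.normal(0.0, 0.1, (h, d))
        self.w = rng.normal(0.0, 0.1, d)

    def forward(self, X):
        self.a = relu(X @ self.W1)      # hidden activations
        self.z = X + self.a @ self.W2   # residual (skip) connection
        return self.z @ self.w

    def mse(self, X, y):
        return float(np.mean((self.forward(X) - y) ** 2))

    def fit(self, X, y, lr=0.01, epochs=300, freeze_body=False):
        """Full-batch gradient descent; freeze_body=True fine-tunes the head only."""
        n = len(y)
        for _ in range(epochs):
            g = 2.0 * (self.forward(X) - y) / n       # dMSE/dprediction
            gz = np.outer(g, self.w)                  # dMSE/dz
            ga = (gz @ self.W2.T) * (self.a > 0.0)    # dMSE/d(hidden), relu mask
            self.w -= lr * self.z.T @ g
            if not freeze_body:                       # transfer: body stays frozen
                self.W2 -= lr * self.a.T @ gz
                self.W1 -= lr * X.T @ ga

d = 4
beta = rng.normal(size=d)
source = lambda X: np.sin(X @ beta)         # stand-in for the old node's response
target = lambda X: 1.2 * np.sin(X @ beta)   # slightly shifted target configuration

Xs = rng.normal(size=(500, d))              # abundant source-node data
Xt = rng.normal(size=(20, d))               # scarce target-configuration data

model = ResidualRegressor(d, h=16)
model.fit(Xs, source(Xs))                   # pretrain on the old node
before = model.mse(Xt, target(Xt))
model.fit(Xt, target(Xt), freeze_body=True) # fine-tune the head on 20 samples
after = model.mse(Xt, target(Xt))
print(f"target MSE before fine-tuning: {before:.4f}, after: {after:.4f}")
```

Freezing the body during fine-tuning is what lets 20 target samples suffice: only the small linear head is re-estimated, while the representation learned from the source node is reused.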
Pages: 82-89
Number of pages: 8