Data Efficient Lithography Modeling with Residual Neural Networks and Transfer Learning

Cited by: 18
Authors
Lin, Yibo [1 ]
Watanabe, Yuki [2 ]
Kimura, Taiki [2 ]
Matsunawa, Tetsuaki [2 ]
Nojima, Shigeki [2 ]
Li, Meng [1 ]
Pan, David Z. [1 ]
Affiliations
[1] Univ Texas Austin, Austin, TX 78712 USA
[2] Toshiba Memory Corp, Tokyo, Japan
DOI: 10.1145/3177540.3178242
CLC number: TP3 [computing technology; computer technology]
Discipline code: 0812
Abstract
Lithography simulation is one of the key steps in physical verification and relies on accurate optical and resist models. A resist model bridges the aerial image simulation to the printed patterns. While the effectiveness of learning-based solutions for resist modeling has been demonstrated, they demand considerable amounts of data. Moreover, a set of manufactured data for a specific lithography configuration is valid only for training a single model, which indicates low data efficiency. Due to the complexity of the manufacturing process, obtaining enough data for acceptable accuracy becomes very expensive in terms of both time and cost, especially during the evolution of technology generations, when the design space is explored intensively. In this work, we propose a new resist modeling framework for contact layers that utilizes existing data from older technology nodes to reduce the amount of data required from a target lithography configuration. Our framework, based on residual neural networks and transfer learning techniques, achieves a 2-10X reduction in the amount of training data while maintaining accuracy comparable to the state-of-the-art learning approach.
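The transfer-learning idea in the abstract, pretrain on plentiful data from an older (source) node and then fine-tune on a small target-node set, can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the synthetic data, the tiny residual network, and all sizes and learning rates are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_data(n, slope):
    # Synthetic stand-in for aerial-image features -> printed-pattern response.
    X = rng.uniform(-1, 1, (n, 2))
    y = np.sin(2 * X[:, :1]) + slope * X[:, 1:]
    return X, y

def init_params(d_in=2, d_h=16):
    s = 0.3
    return {"W1": rng.normal(0, s, (d_in, d_h)),
            "W2": rng.normal(0, s, (d_h, d_h)),
            "Wo": rng.normal(0, s, (d_h, 1))}

def forward(p, X):
    h1 = np.tanh(X @ p["W1"])
    b = h1 @ p["W2"]
    h2 = h1 + np.tanh(b)          # residual (skip) connection
    return h2 @ p["Wo"], (h1, b, h2)

def loss(p, X, y):
    y_hat, _ = forward(p, X)
    return float(np.mean((y_hat - y) ** 2))

def sgd_step(p, X, y, lr):
    # Full-batch gradient descent on mean squared error.
    y_hat, (h1, b, h2) = forward(p, X)
    dy = 2 * (y_hat - y) / len(X)
    dWo = h2.T @ dy
    dh2 = dy @ p["Wo"].T
    db = dh2 * (1 - np.tanh(b) ** 2)
    dW2 = h1.T @ db
    dh1 = dh2 + db @ p["W2"].T    # gradient flows through the skip path too
    dW1 = X.T @ (dh1 * (1 - h1 ** 2))
    p["Wo"] -= lr * dWo
    p["W2"] -= lr * dW2
    p["W1"] -= lr * dW1

# 1) Pretrain on abundant data from the "source" (older) configuration.
Xs, ys = make_data(2000, slope=0.5)
params = init_params()
for _ in range(300):
    sgd_step(params, Xs, ys, lr=0.1)

# 2) Transfer: fine-tune the pretrained weights on a small "target" set
#    whose behavior differs slightly (here, a different slope).
Xt, yt = make_data(40, slope=0.8)
loss_before = loss(params, Xt, yt)
for _ in range(200):
    sgd_step(params, Xt, yt, lr=0.05)
loss_after = loss(params, Xt, yt)
```

Because the pretrained weights already capture the shared structure of the task, only a few target samples are needed to adapt the model, which is the source of the data-efficiency gain the paper reports.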
Pages: 82 - 89 (8 pages)
Related Papers
50 items in total
  • [41] TinyGNN: Learning Efficient Graph Neural Networks
    Yan, Bencheng
    Wang, Chaokun
    Guo, Gaoyang
    Lou, Yunkai
    KDD '20: PROCEEDINGS OF THE 26TH ACM SIGKDD INTERNATIONAL CONFERENCE ON KNOWLEDGE DISCOVERY & DATA MINING, 2020, : 1848 - 1856
  • [42] Deep Learning Neural Networks and Bayesian Neural Networks in Data Analysis
    Chernoded, Andrey
    Dudko, Lev
    Myagkov, Igor
    Volkov, Petr
    XXIII INTERNATIONAL WORKSHOP HIGH ENERGY PHYSICS AND QUANTUM FIELD THEORY (QFTHEP 2017), 2017, 158
  • [43] DSNNs: learning transfer from deep neural networks to spiking neural networks
    Zhang Lei
    Du Zidong
    Li Ling
    Chen Yunji
    High Technology Letters, 2020, 26 (02): 136 - 144
  • [45] Adaptive Transfer Learning on Graph Neural Networks
    Han, Xueting
    Huang, Zhenhuan
    An, Bang
    Bai, Jing
    KDD '21: PROCEEDINGS OF THE 27TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY & DATA MINING, 2021, : 565 - 574
  • [46] Investigating Transfer Learning in Graph Neural Networks
    Kooverjee, Nishai
    James, Steven
    van Zyl, Terence
    ELECTRONICS, 2022, 11 (08)
  • [47] Improved Identification of Forest Types in the Loess Plateau Using Multi-Source Remote Sensing Data, Transfer Learning, and Neural Residual Networks
    Zhang, Mei
    Yin, Daihao
    Li, Zhen
    Zhao, Zhong
    REMOTE SENSING, 2024, 16 (12)
  • [48] Communication efficient distributed learning of neural networks in Big Data environments using Spark
    Alkhoury, Fouad
    Wegener, Dennis
    Sylla, Karl-Heinz
    Mock, Michael
    2021 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA), 2021, : 3871 - 3877
  • [49] Efficient Learning of Healthcare Data from IoT Devices by Edge Convolution Neural Networks
    He, Yan
    Fu, Bin
    Yu, Jian
    Li, Renfa
    Jiang, Rucheng
    APPLIED SCIENCES-BASEL, 2020, 10 (24): 1 - 19
  • [50] On the efficient classification of data structures by neural networks
    Frasconi, P
    Gori, M
    Sperduti, A
    IJCAI-97 - PROCEEDINGS OF THE FIFTEENTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOLS 1 AND 2, 1997, : 1066 - 1071