Data Efficient Lithography Modeling with Residual Neural Networks and Transfer Learning

Cited by: 18
Authors
Lin, Yibo [1 ]
Watanabe, Yuki [2 ]
Kimura, Taiki [2 ]
Matsunawa, Tetsuaki [2 ]
Nojima, Shigeki [2 ]
Li, Meng [1 ]
Pan, David Z. [1 ]
Affiliations
[1] Univ Texas Austin, Austin, TX 78712 USA
[2] Toshiba Memory Corp, Tokyo, Japan
DOI: 10.1145/3177540.3178242
CLC Number: TP3 [Computing technology; computer technology]
Discipline Code: 0812
Abstract
Lithography simulation is one of the key steps in physical verification, enabled by accurate optical and resist models. A resist model bridges the aerial image simulation to the printed patterns. While the effectiveness of learning-based solutions for resist modeling has been demonstrated, they are considerably data-demanding. Moreover, a set of manufactured data for a specific lithography configuration is only valid for training a single model, indicating low data efficiency. Due to the complexity of the manufacturing process, obtaining enough data for acceptable accuracy becomes very expensive in terms of both time and cost, especially during the evolution of technology generations, when the design space is intensively explored. In this work, we propose a new resist modeling framework for contact layers that utilizes existing data from old technology nodes to reduce the amount of data required from a target lithography configuration. Our framework, based on residual neural networks and transfer learning techniques, achieves a 2-10X reduction in the amount of training data while remaining within a competitive range of accuracy, comparable to the state-of-the-art learning approach.
Pages: 82-89 (8 pages)