Data Efficient Lithography Modeling with Residual Neural Networks and Transfer Learning

Cited by: 18
Authors
Lin, Yibo [1 ]
Watanabe, Yuki [2 ]
Kimura, Taiki [2 ]
Matsunawa, Tetsuaki [2 ]
Nojima, Shigeki [2 ]
Li, Meng [1 ]
Pan, David Z. [1 ]
Affiliations
[1] Univ Texas Austin, Austin, TX 78712 USA
[2] Toshiba Memory Corp, Tokyo, Japan
Keywords
DOI
10.1145/3177540.3178242
CLC number
TP3 [Computing technology, computer technology]
Subject classification code
0812
Abstract
Lithography simulation is one of the key steps in physical verification, enabled by substantial optical and resist models. A resist model bridges the aerial image simulation to printed patterns. While the effectiveness of learning-based solutions for resist modeling has been demonstrated, they are considerably data-demanding. Meanwhile, a set of manufactured data for a specific lithography configuration is valid only for training one single model, indicating low data efficiency. Due to the complexity of the manufacturing process, obtaining enough data for acceptable accuracy becomes very expensive in terms of both time and cost, especially during the evolution of technology generations, when the design space is intensively explored. In this work, we propose a new resist modeling framework for contact layers that utilizes existing data from old technology nodes to reduce the amount of data required from a target lithography configuration. Our framework, based on residual neural networks and transfer learning techniques, achieves a 2-10X reduction in the amount of training data while staying within a competitive range of accuracy relative to the state-of-the-art learning approach.
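The core transfer-learning idea in the abstract — pretrain a residual network on abundant data from an old technology node, then adapt it to a new node with few samples — can be illustrated with a minimal numpy sketch. Everything here is an illustrative assumption, not the paper's actual architecture or data: the toy `make_data` regression stands in for aerial-image features mapped to printed-pattern measurements, the network is a single residual block, and "transfer" is modeled by freezing the trunk and fine-tuning only the output layer on the scarce target data.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_data(n, shift):
    # Toy stand-in for lithography data: 8 input features -> 1 output.
    # The "target node" differs from the "source node" by a small shift.
    X = rng.normal(size=(n, 8))
    y = np.sin(X[:, :1]) + 0.5 * X[:, 1:2] + shift
    return X, y

class ResidualMLP:
    """One residual block (x + f(x)) followed by a linear output layer."""
    def __init__(self, d=8, h=8):
        s = 0.3
        self.W1 = rng.normal(scale=s, size=(d, h))
        self.W2 = rng.normal(scale=s, size=(h, d))  # block maps d -> d
        self.Wo = rng.normal(scale=s, size=(d, 1))

    def forward(self, X):
        self.h1 = np.tanh(X @ self.W1)
        self.r = X + self.h1 @ self.W2  # skip connection (residual learning)
        return self.r @ self.Wo

    def train(self, X, y, steps=300, lr=0.05, freeze_trunk=False):
        # Plain gradient descent on MSE; with freeze_trunk=True only the
        # output layer adapts, mimicking cheap fine-tuning on target data.
        for _ in range(steps):
            pred = self.forward(X)
            g = 2.0 * (pred - y) / len(X)        # dMSE/dpred
            gWo = self.r.T @ g
            if not freeze_trunk:
                gr = g @ self.Wo.T               # grad into residual output
                gW2 = self.h1.T @ gr
                gh = (gr @ self.W2.T) * (1.0 - self.h1 ** 2)  # tanh'
                self.W1 -= lr * (X.T @ gh)
                self.W2 -= lr * gW2
            self.Wo -= lr * gWo
        return float(np.mean((self.forward(X) - y) ** 2))

# Abundant source-node data, scarce target-node data (sizes are illustrative).
Xs, ys = make_data(2000, shift=0.0)
Xt, yt = make_data(40, shift=0.3)

model = ResidualMLP()
model.train(Xs, ys)                                   # pretrain on source node
mse_before = float(np.mean((model.forward(Xt) - yt) ** 2))
mse_after = model.train(Xt, yt, freeze_trunk=True)    # fine-tune on target node
```

The fine-tuning step touches far fewer parameters than training from scratch, which is the mechanism by which transfer reduces the target-node data requirement; the paper's actual framework applies this idea with convolutional residual networks on contact-layer lithography data.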
Pages: 82-89 (8 pages)