Hierarchical Mixture-of-Experts approach for neural compact modeling of MOSFETs

Cited by: 5
Authors
Park, Chanwoo [1 ]
Vincent, Premkumar [1 ]
Chong, Soogine [1 ]
Park, Junghwan [1 ]
Cha, Ye Sle [1 ]
Cho, Hyunbo [1 ]
Affiliation
[1] Alsemy Inc, Res & Dev Ctr, 34,Seolleung Ro 90 Gil, Seoul 06193, South Korea
Keywords
Neural compact model; MOSFET; Mixture-of-Experts; Artificial neural network;
DOI
10.1016/j.sse.2022.108500
Chinese Library Classification (CLC): TM [Electrical engineering]; TN [Electronic and communication technology]
Discipline classification codes: 0808; 0809
Abstract
With scaling, physics-based analytical MOSFET compact models are becoming more complex. Parameter extraction from measured or simulated data consumes a significant portion of the compact model generation process. To tackle this problem, artificial neural network (ANN)-based approaches have shown promising improvements in accuracy and speed. However, most previous studies used a multilayer perceptron (MLP) architecture, which commonly requires a large number of parameters and a large amount of training data to guarantee accuracy. In this article, we present a Mixture-of-Experts approach to neural compact modeling. Compared to a conventional neural compact modeling approach, it is 78.43% more parameter-efficient and achieves higher accuracy using fewer data. It also requires 43.8% less training time, demonstrating its computational efficiency.
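To illustrate the general idea behind the abstract, the sketch below shows a minimal, flat (single-level) Mixture-of-Experts forward pass in NumPy: a gating network produces a softmax weighting over several small MLP experts, whose outputs are combined into one prediction (e.g. a drain current given bias and geometry inputs). This is a generic MoE illustration with hypothetical names and randomly initialized weights, not the authors' hierarchical architecture or trained model.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, W1, b1, W2, b2):
    # one-hidden-layer expert with tanh activation
    return np.tanh(x @ W1 + b1) @ W2 + b2

# hypothetical sizes: 4 experts, 4 inputs (e.g. Vgs, Vds, L, W), 8 hidden units
n_experts, d_in, d_hidden = 4, 4, 8
experts = [
    (rng.normal(size=(d_in, d_hidden)), np.zeros(d_hidden),
     rng.normal(size=(d_hidden, 1)), np.zeros(1))
    for _ in range(n_experts)
]
Wg = rng.normal(size=(d_in, n_experts))  # gating-network weights

def moe_forward(x):
    # gate: softmax over experts, conditioned on the same inputs
    logits = x @ Wg
    g = np.exp(logits - logits.max(axis=-1, keepdims=True))
    g /= g.sum(axis=-1, keepdims=True)
    # each expert predicts a scalar (e.g. log Id); gate mixes them
    y = np.stack([mlp(x, *p)[:, 0] for p in experts], axis=-1)
    return (g * y).sum(axis=-1)

x = rng.normal(size=(5, d_in))   # 5 bias/geometry points
y = moe_forward(x)               # one prediction per point, shape (5,)
```

Because each expert is small and only contributes where the gate activates it, such a model can cover distinct operating regions with fewer total parameters than one large MLP, which is consistent with the parameter-efficiency claim in the abstract.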
Pages: 5