Correlated Bayesian Co-Training for Virtual Metrology

Cited by: 2
Authors
Nguyen, Cuong [1 ]
Li, Xin [2 ]
Blanton, Shawn [3 ]
Li, Xiang [4 ]
Affiliations
[1] Inst Infocomm Res, Machine Intellect Dept, Singapore, Singapore
[2] Duke Univ, Dept Elect & Comp Engn, Durham, NC 27708 USA
[3] Carnegie Mellon Univ, Dept Elect & Comp Engn, Pittsburgh, PA 15213 USA
[4] Singapore Inst Mfg Technol, Mfg Syst Res Grp, Singapore, Singapore
Keywords
Co-training; multi-state regression; semi-supervised learning
DOI
10.1109/TSM.2022.3217350
CLC classification number
T [Industrial Technology]
Subject classification code
08
Abstract
A rising challenge in manufacturing data analysis is training robust regression models from limited labeled data. In this work, we investigate a semi-supervised regression scenario in which a manufacturing process operates on multiple mutually correlated states. We exploit this inter-state correlation to improve regression accuracy by developing a novel co-training method, termed Correlated Bayesian Co-training (CBCT). CBCT adopts a block Sparse Bayesian Learning framework to enhance multiple individual regression models that share the same support. Additionally, CBCT places a unified prior distribution on both the coefficient magnitudes and the inter-state correlation. The model parameters are estimated via maximum-a-posteriori (MAP) estimation, while the hyper-parameters are estimated using the expectation-maximization (EM) algorithm. Experimental results from two industrial examples show that CBCT successfully leverages inter-state correlation to reduce the modeling error by up to 79.40% compared to conventional approaches. This suggests that CBCT is of great value to multi-state manufacturing applications.
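To make the co-training idea concrete, the following is a minimal, generic two-view co-training sketch for semi-supervised regression: two plain ridge regressors pseudo-label an unlabeled pool for each other. This is an illustrative stand-in only, not the authors' CBCT algorithm, which instead couples the per-state models through a shared block-sparse prior with MAP/EM inference; all function names and data below are synthetic assumptions.

```python
import numpy as np

def ridge_fit(X, y, lam=1.0):
    """Closed-form ridge regression: w = (X'X + lam*I)^{-1} X'y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def co_train(X1_lab, X2_lab, y_lab, X1_unlab, X2_unlab, rounds=3, lam=1.0):
    """Toy two-view co-training for regression.

    Each round, the model trained on view 1 pseudo-labels the unlabeled
    pool for the model on view 2, and vice versa.  CBCT instead shares a
    block-sparse prior across correlated process states.
    """
    w1 = ridge_fit(X1_lab, y_lab, lam)
    w2 = ridge_fit(X2_lab, y_lab, lam)
    for _ in range(rounds):
        # Exchange pseudo-labels between the two views.
        y_from_1 = X1_unlab @ w1
        y_from_2 = X2_unlab @ w2
        w2 = ridge_fit(np.vstack([X2_lab, X2_unlab]),
                       np.concatenate([y_lab, y_from_1]), lam)
        w1 = ridge_fit(np.vstack([X1_lab, X1_unlab]),
                       np.concatenate([y_lab, y_from_2]), lam)
    return w1, w2

# Synthetic data: both views observe noisy copies of the same signal,
# with only 5 labeled samples and 55 unlabeled ones.
rng = np.random.default_rng(0)
w_true = np.array([1.5, -2.0])
X = rng.normal(size=(60, 2))
y = X @ w_true + 0.05 * rng.normal(size=60)
X1 = X + 0.01 * rng.normal(size=X.shape)
X2 = X + 0.01 * rng.normal(size=X.shape)
w1, w2 = co_train(X1[:5], X2[:5], y[:5], X1[5:], X2[5:], rounds=3, lam=0.1)
```

Even this naive variant shows the mechanism the paper builds on: unlabeled data enters each model's fit through the other model's predictions, so agreement between the views regularizes both.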
Pages: 28-36 (9 pages)