Improving Uncertainty Quantification of Variance Networks by Tree-Structured Learning

Times Cited: 0
Authors
Ma, Wenxuan [1 ]
Yan, Xing [1 ]
Zhang, Kun [1 ]
Affiliations
[1] Renmin Univ China, Inst Stat & Big Data, Beijing 100872, Peoples R China
Keywords
Statistical test; tree-structured learning; uncertainty heterogeneity; variance networks
DOI
10.1109/TNNLS.2023.3342138
CLC Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
To improve the uncertainty quantification of variance networks, we propose a novel tree-structured local neural network model that partitions the feature space into multiple regions based on uncertainty heterogeneity. Given the training data, a tree is built whose leaf nodes represent different regions, and region-specific neural networks are trained in these regions to predict both the mean and the variance for quantifying uncertainty. The proposed uncertainty-splitting neural regression tree (USNRT) employs novel splitting criteria. At each node, a neural network is first trained on the full node data, and a statistical test on the residuals is conducted to find the best split, corresponding to the two subregions with the most significant uncertainty heterogeneity between them. USNRT is computationally friendly, because very few leaf nodes are sufficient and pruning is unnecessary. Furthermore, an ensemble version can easily be constructed to estimate the total uncertainty, including the aleatoric and epistemic components. On an extensive collection of UCI datasets, USNRT and its ensemble show superior performance compared with recent popular methods that quantify uncertainty with variances. Through comprehensive visualization and analysis, we uncover how USNRT works and demonstrate its merits, revealing that uncertainty heterogeneity does exist in many datasets and can be learned by USNRT.
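The abstract describes the split search only at a high level. The minimal Python sketch below illustrates one way such an uncertainty-based split could be scored, assuming Levene's test as the residual-heterogeneity test and a small scikit-learn MLP as the node model; the function name, candidate-threshold grid, and significance threshold are illustrative assumptions, not the paper's specification.

```python
# Hypothetical sketch of the uncertainty-splitting idea (not the authors' code).
# At a node: fit a small regressor, compute residuals, then search candidate
# (feature, threshold) splits for the one whose two sides show the most
# significant difference in residual spread. Levene's test is an assumption;
# the abstract only says "a statistical test on the residuals".
import numpy as np
from scipy.stats import levene
from sklearn.neural_network import MLPRegressor

def best_uncertainty_split(X, y, min_leaf=50, alpha=0.05):
    """Return (feature, threshold, p_value) of the most significant split, or None."""
    model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
    model.fit(X, y)
    resid = y - model.predict(X)          # residuals proxy the local noise level
    best = None
    for j in range(X.shape[1]):
        # coarse candidate grid of thresholds per feature (illustrative choice)
        for thr in np.quantile(X[:, j], [0.25, 0.5, 0.75]):
            left, right = resid[X[:, j] <= thr], resid[X[:, j] > thr]
            if len(left) < min_leaf or len(right) < min_leaf:
                continue
            _, p = levene(left, right)    # test heterogeneity of residual spread
            if best is None or p < best[2]:
                best = (j, thr, p)
    # split only if the heterogeneity is statistically significant
    return best if best is not None and best[2] < alpha else None
```

In the full method as described in the abstract, such a search would be applied recursively until no significant split remains, a mean-variance network would be fit in each resulting leaf, and an ensemble of such trees could combine the aleatoric leaf variances with the epistemic spread across ensemble members.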
Pages: 1-15
Number of Pages: 15