Online random forests regression with memories

Cited: 17
Authors
Zhong, Yuan [1 ,2 ]
Yang, Hongyu [1 ]
Zhang, Yanci [1 ]
Li, Ping [2 ]
Affiliations
[1] Sichuan Univ, Coll Comp Sci, Chengdu, Peoples R China
[2] Southwest Petr Univ, Sch Comp Sci, Chengdu, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Random forests regression; Long-term memory; Online weight learning; Leaf-level; Adaptive learning rate; Stochastic gradient descent; ALGORITHM;
DOI
10.1016/j.knosys.2020.106058
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
In recent years, the online scheme of conventional Random Forests (RFs) has attracted much attention because of its ability to handle sequential data, or data whose distribution changes during the prediction process. However, most research on online RFs focuses on structural modification during the training stage, overlooking critical properties of sequential datasets such as autocorrelation. In this paper, we demonstrate how to improve the predictive accuracy of the regression model by exploiting data correlation. Instead of modifying the structure of the offline-trained RFs, we endow RFs with memory during regression prediction through an online weight learning approach, called Online Weight Learning Random Forest Regression (OWL-RFR). Specifically, the weights of leaves are updated by a novel adaptive stochastic gradient descent method whose adaptive learning rate, unlike a static rate, accounts for both the current and historical prediction bias ratios. A leaf-level weight thus stores the information learned from past data points for future correlated predictions. Compared with a tree-level weight, which retains only immediate memory for the current prediction, leaf-level weights provide long-term memory. Numerical experiments with OWL-RFR show remarkable improvements in predictive accuracy across several common machine learning datasets, compared to traditional RFs and other online approaches. Moreover, our results verify that weighting based on the long-term memory of leaf-level weights is more effective than the immediate dependency of tree-level weights. We also show that the proposed adaptive learning rate is more effective than a static rate on most datasets, and we demonstrate the convergence and stability of our method. (C) 2020 Published by Elsevier B.V.
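To make the mechanism concrete, the following is a minimal Python sketch of leaf-level online weight learning on top of an offline-trained scikit-learn forest. The record does not give the paper's actual update or learning-rate equations, so the update rule, the adaptive-rate formula, and all names here (OWLRFRSketch, eta0, decay, hist_bias) are illustrative assumptions rather than the authors' method:

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    class OWLRFRSketch:
        """Leaf-level online weight learning on an offline-trained forest.
        Hypothetical update rules; not the paper's exact equations."""

        def __init__(self, forest, eta0=0.05, decay=0.9):
            self.forest = forest      # an already-fitted RandomForestRegressor
            self.eta0 = eta0          # base learning rate (assumed)
            self.decay = decay        # smoothing of the historical bias ratio (assumed)
            self.hist_bias = 1.0      # running average of past prediction bias ratios
            # one weight per node per tree; only leaf entries are ever used
            self.weights = [np.ones(t.tree_.node_count) for t in forest.estimators_]

        def predict(self, x):
            """Weighted forest prediction: each tree's output is scaled by
            the weight of the leaf the sample falls into."""
            x = np.asarray(x).reshape(1, -1)
            outs = []
            for tree, w in zip(self.forest.estimators_, self.weights):
                leaf = tree.apply(x)[0]
                outs.append(w[leaf] * tree.predict(x)[0])
            return float(np.mean(outs))

        def update(self, x, y_true):
            """One SGD step on squared error over the active leaves, with a
            heuristic adaptive rate built from current vs. historical bias."""
            x = np.asarray(x).reshape(1, -1)
            y_hat = self.predict(x)
            bias = abs(y_hat - y_true) / (abs(y_true) + 1e-8)  # current bias ratio
            self.hist_bias = self.decay * self.hist_bias + (1 - self.decay) * bias
            eta = self.eta0 * bias / (self.hist_bias + 1e-8)   # adaptive rate (assumed form)
            n_trees = len(self.weights)
            for tree, w in zip(self.forest.estimators_, self.weights):
                leaf = tree.apply(x)[0]
                # gradient of 0.5 * (y_hat - y_true)^2, with y_hat = mean_i w_i * v_i
                grad = (y_hat - y_true) * tree.predict(x)[0] / n_trees
                w[leaf] -= eta * grad

In use, one would fit a RandomForestRegressor offline and then alternate predict and update as each (x, y) pair of the stream arrives; the per-leaf weights accumulate the long-term memory described above, whereas a tree-level variant would keep only a single weight per tree.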
Pages: 10
Related papers (50 total)
  • [1] Online Rebuilding Regression Random Forests
    Zhong, Yuan
    Yang, Hongyu
    Zhang, Yanci
    Li, Ping
    KNOWLEDGE-BASED SYSTEMS, 2021, 221
  • [2] Robustness of random forests for regression
    Roy, Marie-Helene
    Larocque, Denis
    JOURNAL OF NONPARAMETRIC STATISTICS, 2012, 24 (04) : 993 - 1006
  • [3] Covariance regression with random forests
    Alakus, Cansu
    Larocque, Denis
    Labbe, Aurélie
    BMC BIOINFORMATICS, 2023, 24 (01)
  • [4] Mondrian Forests: Efficient Online Random Forests
    Lakshminarayanan, Balaji
    Roy, Daniel M.
    Teh, Yee Whye
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 27 (NIPS 2014), 2014, 27
  • [5] Online incremental random forests
    Hassab Elgawi, Osman
    Hasegawa, Osamu
    INTERNATIONAL CONFERENCE ON MACHINE VISION 2007, PROCEEDINGS, 2007: 102+
  • [6] Regression conformal prediction with random forests
    Johansson, Ulf
    Boström, Henrik
    Löfström, Tuve
    Linusson, Henrik
    MACHINE LEARNING, 2014, 97 (1-2): 155 - 176
  • [7] Quantifying uncertainty in online regression forests
    Vasiloudis, Theodore
    De Francisci Morales, Gianmarco
    Boström, Henrik
    JOURNAL OF MACHINE LEARNING RESEARCH, 2019, 20