ToPs: Ensemble Learning With Trees of Predictors

Cited by: 9
Authors
Yoon, Jinsung [1 ]
Zame, William R. [2 ,3 ]
van der Schaar, Mihaela [4 ]
Affiliations
[1] Univ Calif Los Angeles, Dept Elect Engn, Los Angeles, CA 90095 USA
[2] Univ Calif Los Angeles, Dept Math, Los Angeles, CA 90095 USA
[3] Univ Calif Los Angeles, Dept Econ, Los Angeles, CA 90095 USA
[4] Univ Oxford, Dept Engn Sci, Oxford OX1 3PJ, England
Keywords
Ensemble learning; model tree; personalized predictive models; regression
DOI
10.1109/TSP.2018.2807402
CLC classification
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology]
Discipline classification code
0808; 0809
Abstract
We present a new approach to ensemble learning. Our approach differs from previous approaches in that it constructs and applies different predictive models to different subsets of the feature space. It does this by constructing a tree of subsets of the feature space and associating a predictor (predictive model) with each node of the tree; we call the resulting object a tree of predictors. The (locally) optimal tree of predictors is derived recursively; each step jointly optimizes the split of the terminal nodes of the previous tree and the choice of learner (from among a given set of base learners) and training set, and hence predictor, for each set in the split. The features of a new instance determine a unique path through the optimal tree of predictors; the final prediction aggregates the predictions of the predictors along this path. Thus, our approach uses base learners to create complex learners that are matched to the characteristics of the data set while avoiding overfitting. We establish loss bounds for the final predictor in terms of the Rademacher complexity of the base learners. We report the results of a number of experiments on a variety of datasets, showing that our approach provides statistically significant improvements over a wide variety of state-of-the-art machine learning algorithms, including various ensemble learning methods.
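The construction in the abstract can be illustrated with a toy sketch: grow a tree of feature-space subsets, at each node pick the base learner that minimizes validation loss on the data reaching that node, split only when the children's combined validation loss beats the unsplit node, and predict by aggregating all models on the root-to-leaf path. Everything below is illustrative, not the paper's implementation: the names (`ToPNode`, `fit_tops`, `predict_tops`), the two toy base learners, the median-threshold splits, and the uniform path averaging (the paper optimizes the aggregation weights on validation data) are all assumptions of this sketch.

```python
# A minimal "tree of predictors"-style sketch with squared-error loss and two
# toy base learners: a constant-mean model and a one-feature linear model.

def mean_learner(X, y):
    m = sum(y) / len(y)
    return lambda x: m

def linear_learner(feature):
    def fit(X, y):
        xs = [row[feature] for row in X]
        mx, my = sum(xs) / len(xs), sum(y) / len(y)
        var = sum((v - mx) ** 2 for v in xs) or 1e-12
        slope = sum((v - mx) * (t - my) for v, t in zip(xs, y)) / var
        return lambda x: my + slope * (x[feature] - mx)
    return fit

def mse(model, X, y):
    return sum((model(x) - t) ** 2 for x, t in zip(X, y)) / len(y)

class ToPNode:
    def __init__(self, model):
        self.model = model
        self.feature = self.threshold = None
        self.left = self.right = None

def best_model(learners, Xtr, ytr, Xva, yva):
    # choose the base learner whose fitted model has the lowest validation loss
    models = [fit(Xtr, ytr) for fit in learners]
    return min(models, key=lambda m: mse(m, Xva, yva))

def fit_tops(Xtr, ytr, Xva, yva, learners, depth=0, max_depth=3, min_size=4):
    node = ToPNode(best_model(learners, Xtr, ytr, Xva, yva))
    if depth >= max_depth or len(ytr) < 2 * min_size:
        return node
    base_loss = mse(node.model, Xva, yva)
    best = None
    for f in range(len(Xtr[0])):
        # candidate split: training median of feature f
        vals = sorted(row[f] for row in Xtr)
        thr = vals[len(vals) // 2]
        li = [i for i, row in enumerate(Xtr) if row[f] <= thr]
        ri = [i for i in range(len(Xtr)) if i not in set(li)]
        lv = [i for i, row in enumerate(Xva) if row[f] <= thr]
        rv = [i for i in range(len(Xva)) if i not in set(lv)]
        if min(len(li), len(ri)) < min_size or not lv or not rv:
            continue
        lm = best_model(learners, [Xtr[i] for i in li], [ytr[i] for i in li],
                        [Xva[i] for i in lv], [yva[i] for i in lv])
        rm = best_model(learners, [Xtr[i] for i in ri], [ytr[i] for i in ri],
                        [Xva[i] for i in rv], [yva[i] for i in rv])
        loss = (mse(lm, [Xva[i] for i in lv], [yva[i] for i in lv]) * len(lv)
                + mse(rm, [Xva[i] for i in rv], [yva[i] for i in rv]) * len(rv)) / len(yva)
        # split only if it strictly improves validation loss
        if loss < base_loss and (best is None or loss < best[0]):
            best = (loss, f, thr, li, ri, lv, rv)
    if best:
        _, f, thr, li, ri, lv, rv = best
        node.feature, node.threshold = f, thr
        node.left = fit_tops([Xtr[i] for i in li], [ytr[i] for i in li],
                             [Xva[i] for i in lv], [yva[i] for i in lv],
                             learners, depth + 1, max_depth, min_size)
        node.right = fit_tops([Xtr[i] for i in ri], [ytr[i] for i in ri],
                              [Xva[i] for i in rv], [yva[i] for i in rv],
                              learners, depth + 1, max_depth, min_size)
    return node

def predict_tops(node, x):
    # aggregate (here: uniformly average) every predictor on the path to x's leaf
    preds = []
    while node is not None:
        preds.append(node.model(x))
        if node.feature is None:
            break
        node = node.left if x[node.feature] <= node.threshold else node.right
    return sum(preds) / len(preds)
```

Because a split is kept only when it lowers validation loss, the tree stops growing exactly where further specialization would overfit, which is the mechanism the abstract describes for matching model complexity to the data.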
Pages: 2141 - 2152
Number of pages: 12
Related papers
50 records in total
  • [31] Learning concept-drifting data streams with random ensemble decision trees
    Li, Peipei
    Wu, Xindong
    Hu, Xuegang
    Wang, Hao
    NEUROCOMPUTING, 2015, 166 : 68 - 83
  • [32] Modeling of Gaussian Beam Ensemble Flat-tops for Applications
    Sukuta, Sydney
    Huynh, Thao T.
    LASER RESONATORS AND BEAM CONTROL XI, 2009, 7194
  • [33] Selective ensemble of decision trees
    Zhou, ZH
    Tang, W
    ROUGH SETS, FUZZY SETS, DATA MINING, AND GRANULAR COMPUTING, 2003, 2639 : 476 - 483
  • [34] ENSEMBLE MOTION PLANNING IN TREES
    Frederickson, GN
    Guan, DJ
    30TH ANNUAL SYMPOSIUM ON FOUNDATIONS OF COMPUTER SCIENCE, 1989, : 66 - 71
  • [35] Seeing the forest through the trees - Learning a comprehensible model from a first order ensemble
    Van Assche, Anneleen
    Blockeel, Hendrik
    INDUCTIVE LOGIC PROGRAMMING, 2008, 4894 : 269 - 279
  • [36] Stacking-based ensemble learning of decision trees for interpretable prostate cancer detection
    Wang, Yuyan
    Wang, Dujuan
    Geng, Na
    Wang, Yanzhang
    Yin, Yunqiang
    Jin, Yaochu
    APPLIED SOFT COMPUTING, 2019, 77 : 188 - 204
  • [37] Ensemble Trees Learning Based Improved Predictive Maintenance using IIoT for Turbofan Engines
    Behera, Sourajit
    Choubey, Anurag
    Kanani, Chandresh S.
    Patel, Yashwant Singh
    Misra, Rajiv
    Sillitti, Alberto
    SAC '19: PROCEEDINGS OF THE 34TH ACM/SIGAPP SYMPOSIUM ON APPLIED COMPUTING, 2019, : 842 - 850
  • [38] Prediction of Store Demands by Decision Trees and Recurrent Neural Networks Ensemble with Transfer Learning
    Peric, Nikica
    Munitic, Naomi-Frida
    Basljan, Ivana
    Lesic, Vinko
    ICAART: PROCEEDINGS OF THE 14TH INTERNATIONAL CONFERENCE ON AGENTS AND ARTIFICIAL INTELLIGENCE - VOL 3, 2022, : 218 - 225
  • [39] Ensemble learning from model based trees with application to differential price sensitivity assessment
    Arevalillo, Jorge M.
    INFORMATION SCIENCES, 2021, 557 : 16 - 33
  • [40] Ensemble Tracking Based on Randomized Trees
    Gu Xingfang
    Mao Yaobin
    Kong Jianshou
    PROCEEDINGS OF THE 31ST CHINESE CONTROL CONFERENCE, 2012, : 3818 - 3823