Rule Ensembles for Multi-Target Regression

Cited by: 10
Authors
Aho, Timo [1 ]
Zenko, Bernard [2 ]
Dzeroski, Saso [2 ]
Affiliations
[1] Tampere Univ Technol, Dept Software Syst, Korkeakoulunkatu 1, FI-33101 Tampere, Finland
[2] Jozef Stefan Inst, Dept Knowledge Technol, SI-1000 Ljubljana, Slovenia
Keywords
Multi-Target Prediction; Rule Learning; Regression; Model
DOI
10.1109/ICDM.2009.16
Chinese Library Classification (CLC) number
TP [Automation technology, computer technology]
Discipline classification code
0812
Abstract
Methods for learning decision rules are being successfully applied to many problem domains, especially where understanding and interpretation of the learned model is necessary. In many real-life problems, we would like to predict multiple related (nominal or numeric) target attributes simultaneously. Methods for learning rules that predict multiple targets at once already exist, but are unfortunately based on the covering algorithm, which is not very well suited for regression problems. A better solution for regression may be the rule ensemble approach, which transcribes an ensemble of decision trees into a large collection of rules; an optimization procedure is then used to select the best (and much smaller) subset of these rules and to determine their weights. Using the rule ensemble approach, we have developed a new system for learning rule ensembles for multi-target regression problems. The newly developed method was extensively evaluated, and the results show that the accuracy of multi-target regression rule ensembles is better than that of multi-target regression trees, but somewhat worse than that of multi-target random forests. The rule sets are significantly more concise than random forests, and it is also possible to create very small rule sets that are still comparable in accuracy to single regression trees.
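The abstract outlines a RuleFit-style pipeline: grow a tree ensemble, transcribe its trees into rules, and use an optimization step to select and weight a small subset of rules for all targets at once. The Python sketch below illustrates that general idea on synthetic data; it is not the authors' system, and the scikit-learn estimators, the MultiTaskLasso selection step, and all hyperparameters are assumptions chosen purely for illustration.

import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import MultiTaskLasso

def leaf_rules(tree):
    """Return one conjunctive rule per leaf as a list of (feature, threshold, go_left) tests."""
    t = tree.tree_
    rules = []
    def walk(node, path):
        if t.children_left[node] == -1:                         # leaf: the path so far is a rule
            rules.append(path)
            return
        f, thr = t.feature[node], t.threshold[node]
        walk(t.children_left[node], path + [(f, thr, True)])    # test x[f] <= thr
        walk(t.children_right[node], path + [(f, thr, False)])  # test x[f] >  thr
    walk(0, [])
    return rules

def rule_matrix(rules, X):
    """Evaluate every rule on X, yielding one binary feature column per rule."""
    Z = np.ones((X.shape[0], len(rules)))
    for j, rule in enumerate(rules):
        for f, thr, go_left in rule:
            Z[:, j] *= (X[:, f] <= thr) if go_left else (X[:, f] > thr)
    return Z

# Synthetic multi-target regression problem with three numeric targets (an assumption).
X, Y = make_regression(n_samples=500, n_features=10, n_targets=3, noise=5.0, random_state=0)

# 1) Tree ensemble: shallow trees keep the transcribed rules short and interpretable.
forest = RandomForestRegressor(n_estimators=30, max_depth=3, random_state=0)
forest.fit(X, Y)

# 2) Transcribe every leaf of every tree into a conjunctive rule.
rules = [r for est in forest.estimators_ for r in leaf_rules(est) if r]
Z = rule_matrix(rules, X)

# 3) Sparse multi-task regression selects and weights a small rule subset shared by all
#    targets; alpha trades accuracy against the number of surviving rules.
selector = MultiTaskLasso(alpha=1.0, max_iter=5000)
selector.fit(Z, Y)

kept = np.any(selector.coef_ != 0, axis=0)   # coef_ has shape (n_targets, n_rules)
print(f"kept {kept.sum()} of {len(rules)} rules for {Y.shape[1]} targets")

The MultiTaskLasso here is only a convenient stand-in for the rule-selection and weighting step described in the abstract; the optimization procedure used by the authors may differ.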
Pages: 21+
Number of pages: 3
Related papers
50 records in total
  • [1] Multi-Target Regression with Rule Ensembles
    Aho, Timo
    Zenko, Bernard
    Dzeroski, Saso
    Elomaa, Tapio
    JOURNAL OF MACHINE LEARNING RESEARCH, 2012, 13 : 2367 - 2407
  • [2] Ensembles of Extremely Randomized Trees for Multi-target Regression
    Kocev, Dragi
    Ceci, Michelangelo
    DISCOVERY SCIENCE, DS 2015, 2015, 9356 : 86 - 100
  • [3] Ensembles for multi-target regression with random output selections
    Breskvar, Martin
    Kocev, Dragi
    Dzeroski, Saso
    MACHINE LEARNING, 2018, 107 (11) : 1673 - 1709
  • [4] Self-training for multi-target regression with tree ensembles
    Levatic, Jurica
    Ceci, Michelangelo
    Kocev, Dragi
    Dzeroski, Saso
    KNOWLEDGE-BASED SYSTEMS, 2017, 123 : 41 - 60
  • [5] Feature ranking for multi-target regression
    Petkovic, Matej
    Kocev, Dragi
    Dzeroski, Saso
    MACHINE LEARNING, 2020, 109 (06) : 1179 - 1204
  • [6] MTBR: Multi-Target Boosting for Regression
    Lin, Sangdi
    Azarnoush, Bahareh
    Runger, George
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2021, 33 (02) : 626 - 636
  • [7] Benchmarking multi-target regression methods
    Mastelini, Saulo Martiello
    Santana, Everton Jose
    Turrisi da Costa, Victor G.
    Barbon, Sylvio, Jr.
    2018 7TH BRAZILIAN CONFERENCE ON INTELLIGENT SYSTEMS (BRACIS), 2018, : 396 - 401
  • [8] Conditionally Decorrelated Multi-Target Regression
    Yazar, Orhan
    Elghazel, Haytham
    Hacid, Mohand-Said
    Castin, Nathalie
    NEURAL INFORMATION PROCESSING (ICONIP 2019), PT II, 2019, 11954 : 445 - 457