Tree Induction for Probability-Based Ranking

Cited by: 1
Authors
Foster Provost
Pedro Domingos
Affiliations
[1] New York University
[2] University of Washington
Source
Machine Learning | 2003 / Vol. 52
Keywords
ranking; probability estimation; classification; cost-sensitive learning; decision trees; Laplace correction; bagging;
DOI
Not available
Abstract
Tree induction is one of the most effective and widely used methods for building classification models. However, many applications require cases to be ranked by the probability of class membership. Probability estimation trees (PETs) have the same attractive features as classification trees (e.g., comprehensibility, accuracy and efficiency in high dimensions and on large data sets). Unfortunately, decision trees have been found to provide poor probability estimates. Several techniques have been proposed to build more accurate PETs, but, to our knowledge, there has not been a systematic experimental analysis of which techniques actually improve the probability-based rankings, and by how much. In this paper we first discuss why the decision-tree representation is not intrinsically inadequate for probability estimation. Inaccurate probabilities are partially the result of decision-tree induction algorithms that focus on maximizing classification accuracy and minimizing tree size (for example via reduced-error pruning). Larger trees can be better for probability estimation, even if the extra size is superfluous for accuracy maximization. We then present the results of a comprehensive set of experiments, testing some straightforward methods for improving probability-based rankings. We show that using a simple, common smoothing method—the Laplace correction—uniformly improves probability-based rankings. In addition, bagging substantially improves the rankings, and is even more effective for this purpose than for improving accuracy. We conclude that PETs, with these simple modifications, should be considered when rankings based on class-membership probability are required.
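The two simple modifications the abstract highlights can be illustrated in a few lines. This is a minimal sketch, not the paper's implementation: the function names are ours, and we assume the standard Laplace-correction form (k + 1) / (n + C), where k is the target-class count at a leaf, n the number of training examples at the leaf, and C the number of classes.

```python
def raw_estimate(k, n):
    """Raw frequency estimate of class-membership probability at a leaf."""
    return k / n

def laplace_estimate(k, n, num_classes=2):
    """Laplace-corrected estimate: (k + 1) / (n + C).
    Small leaves are smoothed toward the uniform prior 1/C."""
    return (k + 1) / (n + num_classes)

def bagged_estimate(leaf_counts, num_classes=2):
    """Sketch of bagging's effect on probability estimates: average the
    (smoothed) estimates from the leaf each bagged tree assigns a case to.
    `leaf_counts` is a list of (k, n) pairs, one per tree."""
    return sum(laplace_estimate(k, n, num_classes)
               for k, n in leaf_counts) / len(leaf_counts)

# A pure leaf with only 2 examples: the raw estimate claims certainty,
# while the Laplace estimate hedges toward 0.5.
print(raw_estimate(2, 2))         # 1.0
print(laplace_estimate(2, 2))     # 0.75
```

The example shows why raw leaf frequencies rank poorly: every pure leaf gets probability 1.0 regardless of how much evidence supports it, so many cases tie at the extremes. The correction breaks these ties in favor of better-supported leaves, and averaging over bagged trees smooths the estimates further.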
Pages: 199-215 (16 pages)
Related Papers
50 records in total
  • [1] Tree induction for probability-based ranking
    Provost, F
    Domingos, P
    MACHINE LEARNING, 2003, 52(3): 199-215
  • [2] Probability-based induction software
    Julien, B
    AI APPLICATIONS, 1991, 5(4): 79-80
  • [3] Probability-based game tree pruning
    Li, L
    Marsland, TA
    JOURNAL OF ALGORITHMS, 1990, 11(1): 27-43
  • [4] Experience with 4 probability-based induction methods
    Julien, B
    AI APPLICATIONS, 1992, 6(2): 51-56
  • [5] Improve decision trees for probability-based ranking by lazy learners
    Liang, Han
    Yan, Yuhong
    ICTAI-2006: EIGHTEENTH INTERNATIONAL CONFERENCE ON TOOLS WITH ARTIFICIAL INTELLIGENCE, PROCEEDINGS, 2006: 427+
  • [6] Probability-based approach to rectilinear Steiner tree problems
    Chen, CH
    Zhao, J
    Ahmadi, M
    IEEE TRANSACTIONS ON VERY LARGE SCALE INTEGRATION (VLSI) SYSTEMS, 2002, 10(6): 836-843
  • [7] Attribute weighting with probability estimation trees for improving probability-based ranking in liver diagnosis
    Chu, Na
    Ma, Lizhuang
    Zhou, Min
    Hu, Yiyang
    Liu, Ping
    Che, Zhiying
    2010 IEEE INTERNATIONAL CONFERENCE ON BIOINFORMATICS AND BIOMEDICINE WORKSHOPS (BIBMW), 2010: 733-739
  • [8] Probability-based antibiotherapy
    Portier, H
    PRESSE MEDICALE, 1997, 26(1): 18-19
  • [9] Contextual probability-based classification
    Guo, GD
    Hui, W
    Bell, D
    Liao, ZN
    CONCEPTUAL MODELING - ER 2004, PROCEEDINGS, 2004, 3288: 313-326
  • [10] Probability-based optimal design
    McGree, J. M.
    Eccleston, J. A.
    AUSTRALIAN & NEW ZEALAND JOURNAL OF STATISTICS, 2008, 50(1): 13-28