Comparison of Decision Tree Classification Methods and Gradient Boosted Trees

Cited: 1
Authors
Dikananda, Arif Rinaldi [1 ]
Jumini, Sri [2 ]
Tarihoran, Nafan [3 ]
Christinawati, Santy [4 ]
Trimastuti, Wahyu [4 ]
Rahim, Robbi [5 ]
Institutions
[1] STMIK IKMI, Cirebon, Indonesia
[2] Univ Sains Al Quran Indonesia, Wonosobo, Indonesia
[3] Univ Islam Negeri Sultan Maulana Hasanuddin, Serang, Indonesia
[4] Politekn Piksi Ganesha Bandung, Bandung, Indonesia
[5] Sekolah Tinggi Ilmu Manajemen Sukma, Medan, Indonesia
Keywords
Comparison; Data mining; Classification; C4.5; Random Forest; Accuracy; Algorithm
DOI
10.18421/TEM111-39
CLC Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
The purpose of this research is to analyze the C4.5 and Random Forest algorithms for classification. The two methods were compared to determine which was more accurate in the classification process. The case is the success of university students at one of the private universities. Data were obtained from the https://osf.io/jk2ac dataset. The attributes used were gender, student, average evaluation (NEM), reading session, school origin, and presence as inputs, and success as the result (label). The analysis was performed with RapidMiner software using the same test parameters (k-folds = 2, 3, 4, 5) and the same sampling types (stratified sampling, linear sampling, shuffled sampling). The first result shows that the k-fold test with stratified sampling achieved an average accuracy of 55.76 percent (C4.5) and 56.18 percent (Random Forest). The second result shows that the k-fold test with linear sampling achieved an average accuracy of 58.06 percent (C4.5) and 65.06 percent (Random Forest). The third result shows that the k-fold test with shuffled sampling achieved an average accuracy of 58.68 percent (C4.5) and 60.76 percent (Random Forest). From the three test results, in the case of student success at a private university, the Random Forest method performs better than C4.5.
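The comparison protocol described in the abstract (cross-validating a single decision tree and a random forest on the same folds with k = 2, 3, 4, 5 and different sampling types, then comparing mean accuracy) can be sketched in Python with scikit-learn. This is an illustrative sketch, not the authors' pipeline: synthetic data stands in for the https://osf.io/jk2ac student-success dataset, and scikit-learn's entropy-based `DecisionTreeClassifier` is only an analogue of C4.5 (it uses information gain rather than gain ratio).

```python
# Sketch of the comparison protocol: cross-validate a single decision
# tree (an analogue of C4.5) and a random forest on the same folds,
# then compare mean accuracy across k = 2, 3, 4, 5.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import KFold, StratifiedKFold, cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the student-success dataset (6 input attributes,
# binary success label).
X, y = make_classification(n_samples=300, n_features=6, n_informative=4,
                           random_state=0)

models = {
    "decision tree": DecisionTreeClassifier(criterion="entropy", random_state=0),
    "random forest": RandomForestClassifier(n_estimators=100, random_state=0),
}

# Two of the sampling types from the paper: stratified sampling
# (StratifiedKFold) and shuffled sampling (KFold with shuffle=True).
samplers = {
    "stratified": lambda k: StratifiedKFold(n_splits=k),
    "shuffled": lambda k: KFold(n_splits=k, shuffle=True, random_state=0),
}

for name, model in models.items():
    for sampling, make_cv in samplers.items():
        for k in (2, 3, 4, 5):
            acc = cross_val_score(model, X, y, cv=make_cv(k),
                                  scoring="accuracy").mean()
            print(f"{name}, {sampling}, k={k}: mean accuracy {acc:.4f}")
```

Averaging the per-k accuracies for each model and sampling type reproduces the shape of the paper's three comparisons, though the numbers here come from synthetic data.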
Pages: 316-322 (7 pages)