An improved random forest based on the classification accuracy and correlation measurement of decision trees

Cited by: 105
Authors
Sun, Zhigang [1 ,2 ,4 ,5 ]
Wang, Guotao [1 ,2 ,4 ,5 ]
Li, Pengfei [2 ,4 ,5 ]
Wang, Hui [3 ]
Zhang, Min [1 ]
Liang, Xiaowen [1 ]
Affiliations
[1] Heilongjiang Univ, Sch Elect & Elect Engn, Harbin 150080, Peoples R China
[2] Harbin Inst Technol, Sch Elect Engn & Automat, Harbin 150001, Peoples R China
[3] Yangzhou Univ, Sch Hydraul Sci & Engn, Yangzhou 225009, Peoples R China
[4] Key Lab Elect & Elect Reliabil Technol Heilongjian, Harbin 150001, Peoples R China
[5] MOE Key Lab Reliabil & Qual Consistency Elect Comp, Harbin 150001, Peoples R China
Keywords
Classification accuracy; Correlation measurement; Dot product; Random forest; CART; Distance
DOI
10.1016/j.eswa.2023.121549
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Random forest is one of the most widely used machine learning algorithms. The decision trees used to construct a random forest may have low classification accuracy or high mutual correlation, both of which degrade the overall performance of the forest. To address these problems, the authors propose an improved random forest based on the classification accuracy and correlation measurement of decision trees. The main idea has two parts: retaining the classification and regression trees (CARTs) with better classification performance, and reducing the correlations between the CARTs. Specifically, in the classification-effect evaluation part, each CART makes predictions on three reserved data sets and its average classification accuracy is computed; all CARTs are then sorted in descending order of average accuracy. In the correlation-measurement part, an improved dot-product method is proposed to calculate the cosine similarity, i.e., the correlation, between CARTs in the feature space. Using the average classification accuracy as a reference, a grid search is used to find the inner-product threshold; for each pair of CARTs whose inner-product value exceeds this threshold, the CART with the lower average accuracy is marked as deletable. The average classification accuracies and correlations of the CARTs are then considered jointly: trees with high correlation and weak classification performance are deleted, and those of better quality are retained to construct the random forest. Multiple experiments show that the proposed improved random forest achieved higher average classification accuracy than the five random forests used for comparison, and that this lead was stable.
The G-mean and out-of-bag data (OBD) scores obtained by the proposed improved random forest were also higher than those of the five comparison forests, with an even clearer lead. In addition, the results of three non-parametric tests show significant differences between the proposed improved random forest and the other five random forests, which effectively demonstrates the superiority and practicality of the proposed method.
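The selection procedure described in the abstract can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: here the "correlation" between two trees is assumed to be the cosine similarity of their prediction vectors on a single held-out set (the paper measures it in the feature space and uses three reserved sets), and a fixed similarity threshold stands in for the grid-searched inner-product threshold.

```python
# Sketch: train bootstrapped CARTs, rank by held-out accuracy, then drop the
# weaker tree of any pair whose prediction vectors are too similar.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=600, n_features=10, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

# 1) Train CARTs on bootstrap samples; record each tree's validation accuracy.
n_trees = 30
trees, accs, preds = [], [], []
for i in range(n_trees):
    idx = rng.integers(0, len(X_tr), len(X_tr))            # bootstrap sample
    t = DecisionTreeClassifier(random_state=i).fit(X_tr[idx], y_tr[idx])
    p = t.predict(X_val)
    trees.append(t)
    preds.append(p)
    accs.append((p == y_val).mean())

# 2) Sort trees in descending order of average accuracy.
order = np.argsort(accs)[::-1]

# 3) Cosine similarity between prediction vectors; keep a tree only if it is
#    not too similar to any already-kept (i.e., more accurate) tree.
P = np.array(preds, dtype=float)
norms = np.linalg.norm(P, axis=1, keepdims=True)
sim = (P @ P.T) / (norms @ norms.T)
threshold = 0.95                                           # assumed; grid-searched in the paper
keep = []
for i in order:                                            # most accurate first
    if all(sim[i, j] <= threshold for j in keep):
        keep.append(i)

ensemble = [trees[i] for i in keep]
print(f"kept {len(ensemble)} of {n_trees} trees")
```

The greedy pass over the accuracy-sorted list guarantees that whenever two trees exceed the similarity threshold, the one with the lower accuracy is the one discarded, which matches the deletion rule the abstract describes.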
Pages: 19