An improved random forest based on the classification accuracy and correlation measurement of decision trees

Cited: 105
Authors
Sun, Zhigang [1 ,2 ,4 ,5 ]
Wang, Guotao [1 ,2 ,4 ,5 ]
Li, Pengfei [2 ,4 ,5 ]
Wang, Hui [3 ]
Zhang, Min [1 ]
Liang, Xiaowen [1 ]
Affiliations
[1] Heilongjiang Univ, Sch Elect & Elect Engn, Harbin 150080, Peoples R China
[2] Harbin Inst Technol, Sch Elect Engn & Automat, Harbin 150001, Peoples R China
[3] Yangzhou Univ, Sch Hydraul Sci & Engn, Yangzhou 225009, Peoples R China
[4] Key Lab Elect & Elect Reliabil Technol Heilongjiang, Harbin 150001, Peoples R China
[5] MOE Key Lab Reliabil & Qual Consistency Elect Comp, Harbin 150001, Peoples R China
Keywords
Classification accuracy; Correlation measurement; Dot product; Random forest; CART; Distance
DOI
10.1016/j.eswa.2023.121549
Chinese Library Classification
TP18 [Theory of Artificial Intelligence]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Random forest is one of the most widely used machine learning algorithms. The decision trees used to construct a random forest may have low classification accuracy or high mutual correlation, either of which degrades the overall performance of the forest. To address these problems, this paper proposes an improved random forest based on the classification accuracy and correlation measurement of decision trees. The main idea has two parts: retaining the classification and regression trees (CARTs) with better classification performance, and reducing the correlations between the CARTs. Specifically, in the classification-effect evaluation part, each CART makes predictions on three reserved data sets and its average classification accuracy is computed; all CARTs are then sorted in descending order of average classification accuracy. In the correlation measurement part, an improved dot product method is proposed to calculate the cosine similarity, i.e., the correlation, between CARTs in the feature space. Using the achieved average classification accuracies as a reference, a grid search is used to find the inner product threshold. For each CART pair whose inner product value exceeds this threshold, the CART with the lower average classification accuracy is marked as deletable. By jointly considering the average classification accuracies and correlations of the CARTs, those with high correlation and weak classification performance are deleted, and those of better quality are retained to construct the random forest. Multiple experiments show that the proposed improved random forest achieves higher average classification accuracy than the five random forests used for comparison, and the lead is stable.
The G-means and out-of-bag data (OBD) scores obtained by the proposed improved random forest were also higher than those of the five comparison random forests, with an even clearer lead. In addition, the results of three non-parametric tests show significant differences between the proposed improved random forest and the other five random forests, which demonstrates the superiority and practicality of the proposed method.
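The pruning pipeline described above (accuracy ranking, pairwise cosine similarity, threshold-based deletion) can be sketched as follows. This is a minimal illustration under assumptions, not the authors' implementation: the paper's improved dot product operates in the feature space and uses three reserved sets plus a grid-searched threshold, whereas here each CART is represented by its prediction vector on a single reserved set and the threshold is fixed.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=600, n_features=10, random_state=0)
X_train, X_res, y_train, y_res = train_test_split(
    X, y, test_size=0.5, random_state=0)

# 1) Grow CARTs on bootstrap samples (the usual random-forest base learners).
n_trees = 30
trees = []
for _ in range(n_trees):
    idx = rng.integers(0, len(X_train), len(X_train))
    t = DecisionTreeClassifier(max_features="sqrt", random_state=0)
    t.fit(X_train[idx], y_train[idx])
    trees.append(t)

# 2) Average classification accuracy of each CART on reserved data
#    (the paper averages over three reserved sets; one is used here).
acc = np.array([t.score(X_res, y_res) for t in trees])

# 3) Pairwise correlation as cosine similarity: encode each tree's
#    predictions as a +/-1 vector so the dot product rewards agreement,
#    then normalize so S[i, j] is the cosine of the angle between trees.
P = np.array([np.where(t.predict(X_res) == 1, 1.0, -1.0) for t in trees])
P /= np.linalg.norm(P, axis=1, keepdims=True)
S = P @ P.T

# 4) For each pair above the similarity threshold, mark the less
#    accurate tree as deletable; survivors form the improved forest.
threshold = 0.9
deletable = set()
for i in range(n_trees):
    for j in range(i + 1, n_trees):
        if S[i, j] > threshold:
            deletable.add(i if acc[i] < acc[j] else j)

kept = [t for k, t in enumerate(trees) if k not in deletable]
print(f"kept {len(kept)} of {n_trees} trees")
```

Because the best-scoring tree in a redundant pair is always the one retained, the surviving ensemble trades a smaller size for lower inter-tree correlation without discarding its strongest members.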
Pages: 19