A neural network boosting regression model based on XGBoost

Cited by: 0
Authors
Dong, Jianwei [1 ]
Chen, Yumin [1 ]
Yao, Bingyu [1 ]
Zhang, Xiao [1 ]
Zeng, Nianfeng [2 ]
Affiliations
[1] Xiamen Univ Technol, Coll Comp & Informat Engn, Xiamen 361024, Fujian, Peoples R China
[2] E Success Informat Technol Co Ltd, Xiamen 361024, Fujian, Peoples R China
Keywords
Ensemble learning; Neural networks; Boosting; Regression model; Deep learning;
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Boosting models are a family of ensemble learning techniques, including XGBoost and GBDT, that use decision trees as weak learners and achieve strong results on classification and regression problems. Neural networks perform very well on image and speech recognition, but their weak interpretability limits the development of fusion models. Drawing on the principles and methods of traditional boosting models, we propose a Neural Network Boosting (NNBoost) regression, which uses shallow neural networks with simple structures as weak learners. NNBoost is a new ensemble learning method that attains low regression errors on several data sets. The target loss function of NNBoost is approximated by a Taylor expansion, and by deriving its gradient form we obtain a gradient descent algorithm. Deep learning architectures are complex and suffer from problems such as vanishing gradients, weak interpretability, and parameters that are difficult to tune. By ensembling simple neural networks, we alleviate the vanishing gradient problem that is laborious to address in deep learning and reduce overfitting of the learning algorithm. Finally, experiments verify the correctness and effectiveness of NNBoost from multiple angles, demonstrate the benefit of fusing multiple shallow neural networks, and to some extent widen the development path of the boosting idea and deep learning. (C) 2022 Elsevier B.V. All rights reserved.
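To make the boosting-with-shallow-networks idea concrete, the sketch below shows a generic additive ensemble in which each weak learner is a small scikit-learn MLPRegressor fitted to the current residuals. This is an illustrative assumption-laden sketch, not the authors' NNBoost implementation: it assumes a squared-error loss (so the negative gradient is simply the residual) rather than the paper's Taylor-expanded target loss, and the helper names fit_nn_boost and predict_nn_boost are hypothetical.

# Illustrative sketch only: gradient boosting with shallow MLPs as weak learners.
# Assumes squared-error loss, so the negative gradient equals the residual; the
# paper's NNBoost instead derives its updates from a Taylor expansion of the loss.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

def fit_nn_boost(X, y, n_rounds=20, learning_rate=0.1, hidden=(8,)):
    # Start from a constant model (the mean of y), then add shrunken weak learners.
    init = float(np.mean(y))
    pred = np.full(len(y), init)
    learners = []
    for _ in range(n_rounds):
        residual = y - pred                      # negative gradient of 0.5*(y - f)^2
        weak = MLPRegressor(hidden_layer_sizes=hidden, max_iter=500, random_state=0)
        weak.fit(X, residual)                    # shallow net fits the residuals
        pred = pred + learning_rate * weak.predict(X)
        learners.append(weak)
    return init, learners

def predict_nn_boost(init, learners, X, learning_rate=0.1):
    pred = np.full(X.shape[0], init)
    for weak in learners:
        pred = pred + learning_rate * weak.predict(X)
    return pred

X, y = make_regression(n_samples=500, n_features=10, noise=5.0, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
init, learners = fit_nn_boost(X_tr, y_tr)
print("test MSE:", mean_squared_error(y_te, predict_nn_boost(init, learners, X_te)))

The shrinkage factor (learning_rate) plays the same regularizing role as in XGBoost and GBDT, and keeping the hidden layers small keeps each weak learner simple, which matches the abstract's motivation for avoiding the vanishing gradient and overfitting issues of deep networks.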
Pages: 11
Related papers
50 in total
[1] Dong, Jianwei; Chen, Yumin; Yao, Bingyu; Zhang, Xiao; Zeng, Nianfeng. A neural network boosting regression model based on XGBoost. APPLIED SOFT COMPUTING, 2022, 125.
[2] Bailly, Kevin; Milgram, Maurice. Boosting feature selection for Neural Network based regression. NEURAL NETWORKS, 2009, 22 (5-6): 748-756.
[3] Wang, Li; Zhu, Xuefeng. A modified boosting based neural network ensemble method for regression and forecasting. ICIEA 2007: 2ND IEEE CONFERENCE ON INDUSTRIAL ELECTRONICS AND APPLICATIONS, VOLS 1-4, PROCEEDINGS, 2007: 1280-1285.
[4] Bin, Junchi; Tang, Shiyuan; Liu, Yihao; Wang, Gang; Gardiner, Bryan; Liu, Zheng; Li, Eric. Regression Model for Appraisal of Real Estate Using Recurrent Neural Network and Boosting Tree. 2017 2ND IEEE INTERNATIONAL CONFERENCE ON COMPUTATIONAL INTELLIGENCE AND APPLICATIONS (ICCIA), 2017: 209-213.
[5] Subbotin, S. Neural Network Model Synthesis Based on a Regression Tree. Automatic Control and Computer Sciences, 2020, 54: 313-322.
[6] Subbotin, S. Neural Network Model Synthesis Based on a Regression Tree. AUTOMATIC CONTROL AND COMPUTER SCIENCES, 2020, 54 (04): 313-322.
[7] Ohno, Hiroshi. Neural network-based transductive regression model. APPLIED SOFT COMPUTING, 2019, 84.
[8] Zhao Xinchi; Hu Anming; He Wei. Fall Detection Based on Convolutional Neural Network and XGBoost. LASER & OPTOELECTRONICS PROGRESS, 2020, 57 (16).
[9] Jia, Lixiu; Jia, Lixin; Zhao, Jian; Feng, Lihang; Huang, Xiaohua. A multimodal visual fatigue assessment model based on back propagation neural network and XGBoost. DISPLAYS, 2024, 83.
[10] Sun, Yanping; Wang, Hongsheng; Wang, Zhongdao. A Novel Channel Model Based on the General Regression Neural Network. 2015 4TH INTERNATIONAL CONFERENCE ON ENERGY AND ENVIRONMENTAL PROTECTION (ICEEP 2015), 2015: 574-578.