Ensemble learning prediction model for rapeseed flowering periods incorporating virtual sample generation

Cited by: 0
Authors
Xie, Qianwei [1 ]
Xue, Fengchang [1 ]
Chen, Jianfei [2 ]
Affiliations
[1] Meteorological Disaster Geographic Information Engineering Laboratory, Nanjing University of Information Science & Technology, Nanjing 210044, China
[2] Guangxi Zhuang Autonomous Region Lightning Protection Center, Nanning 530000, China
Keywords
Decision trees;
DOI
10.11975/j.issn.1002-6819.202404106
Abstract
Linear regression cannot fully capture the complex non-linear relationships among the factors that influence the flowering period, and observed flowering samples are scarce. In this study, an ensemble learning model incorporating virtual sample generation was proposed to predict the flowering period of rapeseed. Full-bloom records of rapeseed and meteorological data from Longyou County, Quzhou City, Zhejiang Province, China for 1998 to 2023 were used. The original samples were expanded with Gaussian Mixture Model-based Virtual Sample Generation (GMM-VSG) and cubic spline interpolation, yielding two new datasets of 985 samples each. Models were then built with eight machine learning methods: Random Forest (RF), Kernel Ridge Regression (KRR), Ridge Regression (RR), Least Absolute Shrinkage and Selection Operator (Lasso), Support Vector Regression (SVR), Extreme Gradient Boosting (XGBoost), Light Gradient Boosting Machine (LightGBM), and Gradient Boosting Decision Tree (GBDT), with hyperparameters tuned by a Bayesian optimizer. Finally, a prediction model for the rapeseed flowering period was established using stacking ensemble learning. Most models performed better on the cubic spline interpolation (Cubic) dataset than on the original and GMM-VSG datasets. The RF model, for example, achieved an RMSE of 0.679 d, an MAE of 0.351 d, and an R2 of 0.990, a substantial improvement over the original dataset (RMSE 6.286 d, MAE 5.028 d, R2 0.201) and the GMM-VSG dataset (RMSE 2.680 d, MAE 1.588 d, R2 0.881). The SVR model also improved on the Cubic dataset, reaching an RMSE of 0.849 d, an MAE of 0.333 d, and an R2 of 0.984. Among the ensemble learners, LightGBM performed best on the Cubic dataset, with the lowest RMSE of 0.613 d, an MAE of 0.336 d, and the highest R2 of 0.992, confirming its strong feature learning and noise resistance in capturing the complex relationships within the dataset. In contrast, the Lasso and RR models showed no significant improvement on the Cubic dataset; Lasso, for instance, gave an RMSE of 3.879 d and an MAE of 3.054 d, lower than its original RMSE of 6.329 d and MAE of 5.567 d but still far behind the other models. Five models were then developed with the stacking ensemble learning approach: SRX_L, All_L, SLL_L, SRL_L, and SRK_L. Among them, SRX_L performed best on every metric, with the highest R2 of 0.9997 and the lowest RMSE and MAE of all models, 0.1227 d and 0.1056 d, respectively. The predicted flowering trends were broadly consistent with the observed ones, and high accuracy was obtained in most years, particularly 2001, 2011, and 2014, where the predictions closely matched the observations with discrepancies sometimes below 0.01 d or approaching zero. Larger differences occurred in a few years, such as 1999 and 2023; 1999 showed the largest discrepancy, with an error of 0.4421 d. The maximum observed flowering period, 92 days in 2005, was predicted with an error of 0.0416 d, and the minimum, 63 days in 2020, with an error of 0.1325 d, so the model can be expected to predict extreme values with high accuracy. Virtual sample generation is well suited to small datasets: it markedly improved the predictive accuracy and generalizability of the model while reducing the cost and difficulty of data collection. Compared with single machine learning models, stacking ensemble learning substantially improved predictive performance and is well suited to complex tasks with non-linear relationships, such as predicting the flowering period of rapeseed. © 2024 Chinese Society of Agricultural Engineering. All rights reserved.
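The sketch below illustrates the general workflow described in the abstract, virtual sample generation followed by a stacking ensemble, as a minimal example built on SciPy and scikit-learn. The toy data, number of GMM components, choice of base learners, and the Ridge meta-learner are illustrative assumptions and not the configuration used in the paper; the eight tuned base models and the Bayesian hyperparameter search are omitted for brevity.

```python
# Minimal sketch: expand a small flowering-period dataset with virtual samples,
# then fit a stacking ensemble. Toy data and model choices are assumptions,
# not the configuration reported in the paper.
import numpy as np
from scipy.interpolate import CubicSpline
from sklearn.mixture import GaussianMixture
from sklearn.ensemble import RandomForestRegressor, StackingRegressor
from sklearn.svm import SVR
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, mean_absolute_error, r2_score

rng = np.random.default_rng(0)

# Stand-in for the 26 yearly samples (1998-2023): meteorological predictors X
# and the flowering period y in days.
X = rng.normal(size=(26, 5))
y = 75 + 10 * np.tanh(X[:, 0]) + rng.normal(scale=2.0, size=26)

# Virtual sample generation, option 1: GMM-based (GMM-VSG).
gmm = GaussianMixture(n_components=3, random_state=0).fit(np.column_stack([X, y]))
virtual, _ = gmm.sample(985 - len(y))            # expand to 985 samples in total
X_gmm = np.vstack([X, virtual[:, :-1]])
y_gmm = np.concatenate([y, virtual[:, -1]])

# Virtual sample generation, option 2: cubic spline interpolation along the
# chronological sample index, resampled on a denser grid.
t = np.arange(len(y))
t_new = np.linspace(0, len(y) - 1, 985)
X_cubic = np.column_stack([CubicSpline(t, X[:, j])(t_new) for j in range(X.shape[1])])
y_cubic = CubicSpline(t, y)(t_new)

# Stacking ensemble: base regressors feeding a Ridge meta-learner.
stack = StackingRegressor(
    estimators=[("rf", RandomForestRegressor(n_estimators=300, random_state=0)),
                ("svr", SVR(C=10.0))],
    final_estimator=Ridge(alpha=1.0),
)
X_tr, X_te, y_tr, y_te = train_test_split(X_cubic, y_cubic, test_size=0.2, random_state=0)
stack.fit(X_tr, y_tr)
pred = stack.predict(X_te)
rmse = float(np.sqrt(mean_squared_error(y_te, pred)))
print(f"RMSE={rmse:.3f} d, MAE={mean_absolute_error(y_te, pred):.3f} d, "
      f"R2={r2_score(y_te, pred):.3f}")
```

In the paper, the stacking layer combines several of the eight tuned base models (for example, the SRX_L combination), so the two base learners above would be replaced accordingly.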
Pages: 159-167