A Large-Scale Ensemble Learning Framework for Demand Forecasting

Cited by: 1
Authors
Park, Young-Jin [1 ]
Kim, Donghyun [2 ]
Odermatt, Frederic [3 ]
Lee, Juho [4 ]
Kim, Kyung-Min [5 ,6 ]
Institutions
[1] MIT, Cambridge, MA 02139 USA
[2] Seoul Natl Univ, Seoul, South Korea
[3] ETH Zurich, Zurich, Switzerland
[4] Superpetual Inc, Seoul, South Korea
[5] NAVER CLOVA, Seongnam, South Korea
[6] NAVER Lab, Seongnam, South Korea
Keywords
DOI
10.1109/ICDM54844.2022.00048
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Demand forecasting is a crucial component of supply chain management for revenue optimization and inventory planning. Traditional time series forecasting methods, however, have resulted in small models with limited expressive power because they have difficulty scaling their model size up while maintaining high accuracy. In this paper, we propose Forecasting orchestra (Forchestra), a simple but powerful ensemble framework capable of accurately predicting future demand for a diverse range of items. Forchestra consists of two parts: 1) base predictors and 2) a neural conductor. For a given time series, each base predictor outputs its respective forecast based on historical observations. On top of the base predictors, the neural conductor adaptively assigns an importance weight to each predictor by examining the representation vector provided by a representation module. Finally, Forchestra aggregates the predictions using these weights and constructs a final prediction. In contrast to previous ensemble approaches, the neural conductor and all base predictors of Forchestra are trained in an end-to-end manner; this allows each base predictor to modify its reaction to different inputs, while supporting the other predictors and constructing a final prediction jointly. We empirically show that the model size is scalable up to 0.8 billion parameters (approximately a 400-layer LSTM). The proposed method is evaluated on our proprietary E-Commerce (100K) and the public M5 (30K) datasets, and it outperforms existing forecasting models by a significant margin. In addition, we observe that our framework generalizes well to unseen data points when evaluated in a zero-shot fashion on downstream datasets. Last but not least, we present extensive qualitative and quantitative studies to analyze how the proposed model outperforms baseline models and differs from conventional ensemble approaches. The code is available at https://github.com/young-j-park/22-ICDM-Forchestra.
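The abstract describes a three-step mechanism: base predictors each produce a forecast, a neural conductor scores them from a representation of the input series, and the final prediction is the weight-aggregated forecast. The following is a minimal NumPy sketch of that aggregation scheme only; the base predictors and conductor here are fixed random linear maps standing in for the learned, end-to-end-trained neural modules of the paper, and all names (`forchestra_forecast`, `lookback`, `horizon`) are illustrative assumptions, not the authors' API.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # numerically stable softmax over a 1-D score vector
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

# Hypothetical stand-ins: in the paper these are neural networks trained
# jointly; here each "base predictor" is a fixed linear map from the
# lookback window to the forecast horizon.
n_predictors, horizon, lookback = 3, 4, 8
base_maps = [rng.normal(size=(horizon, lookback)) for _ in range(n_predictors)]
conductor_map = rng.normal(size=(n_predictors, lookback))

def forchestra_forecast(history):
    # 1) each base predictor outputs its own forecast from the history
    forecasts = np.stack([W @ history for W in base_maps])  # (n_predictors, horizon)
    # 2) the conductor assigns an importance weight per predictor; in the
    #    paper it looks at a learned representation vector, here the raw history
    weights = softmax(conductor_map @ history)              # (n_predictors,)
    # 3) aggregate the base forecasts by the weights
    return weights @ forecasts                              # (horizon,)

history = rng.normal(size=lookback)
pred = forchestra_forecast(history)
print(pred.shape)  # (4,)
```

Because the softmax weights are a convex combination, the final forecast always lies inside the span of the base forecasts; the end-to-end training described in the abstract is what lets the base predictors specialize rather than being fit independently first.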
Pages: 378 - 387
Page count: 10
Related Papers
50 records total
  • [21] Ensemble learning framework for forecasting construction costs
    Habib, Omar
    Abouhamad, Mona
    Bayoumi, Abdelmoniem
    AUTOMATION IN CONSTRUCTION, 2025, 170
  • [22] Tourism demand forecasting: An ensemble deep learning approach
    Sun, Shaolong
    Li, Yanzhao
    Guo, Ju-e
    Wang, Shouyang
    TOURISM ECONOMICS, 2022, 28 (08) : 2021 - 2049
  • [23] Precipitation forecasting by large-scale climate indices and machine learning techniques
    Rostam, Mehdi Gholami
    Sadatinejad, Seyyed Javad
    Malekian, Arash
    JOURNAL OF ARID LAND, 2020, 12 (05) : 854 - 864
  • [26] An Ensemble Stochastic Forecasting Framework for Variable Distributed Demand Loads
    Agyeman, Kofi Afrifa
    Kim, Gyeonggak
    Jo, Hoonyeon
    Park, Seunghyeon
    Han, Sekyung
    ENERGIES, 2020, 13 (10)
  • [27] Forecasting tourism demand with a novel robust decomposition and ensemble framework
    Li, Xin
    Zhang, Xu
    Zhang, Chengyuan
    Wang, Shouyang
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 236
  • [28] A Framework of Large-Scale Peer-to-Peer Learning System
    Luo, Yongkang
    Han, Peiyi
    Luo, Wenjian
    Xue, Shaocong
    Chen, Kesheng
    Song, Linqi
    NEURAL INFORMATION PROCESSING, ICONIP 2023, PT II, 2024, 14448 : 27 - 41
  • [29] A large-scale graph learning framework of technological gatekeepers by MapReduce
    School of Economics and Management, Beihang University, Beijing, China
    Proc. IEEE Int. Parallel Distrib. Process. Symp. Workshops, IPDPSW: 1997 - 2003
  • [30] Operation optimization in large-scale heat pump systems: A scheduling framework integrating digital twin modelling, demand forecasting, and MILP
    Aguilera, Jose Joaquin
    Padulles, Roger
    Meesenburg, Wiebke
    Markussen, Wiebke Brix
    Zuehlsdorf, Benjamin
    Elmegaard, Brian
    APPLIED ENERGY, 2024, 376