A Large-Scale Ensemble Learning Framework for Demand Forecasting

Cited by: 1
Authors
Park, Young-Jin [1 ]
Kim, Donghyun [2 ]
Odermatt, Frederic [3 ]
Lee, Juho [4 ]
Kim, Kyung-Min [5 ,6 ]
Affiliations
[1] MIT, Cambridge, MA 02139 USA
[2] Seoul Natl Univ, Seoul, South Korea
[3] ETH Zurich, Zurich, Switzerland
[4] Superpetual Inc, Seoul, South Korea
[5] NAVER CLOVA, Seongnam, South Korea
[6] NAVER Lab, Seongnam, South Korea
DOI: 10.1109/ICDM54844.2022.00048
CLC classification: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
Demand forecasting is a crucial component of supply chain management for revenue optimization and inventory planning. Traditional time series forecasting methods, however, have resulted in small models with limited expressive power because they have difficulty scaling up their model size while maintaining high accuracy. In this paper, we propose Forecasting orchestra (Forchestra), a simple but powerful ensemble framework capable of accurately predicting future demand for a diverse range of items. Forchestra consists of two parts: 1) base predictors and 2) a neural conductor. For a given time series, each base predictor outputs its respective forecast based on historical observations. On top of the base predictors, the neural conductor adaptively assigns an importance weight to each predictor by looking at the representation vector provided by a representation module. Finally, Forchestra aggregates the predictions by the weights and constructs a final prediction. In contrast to previous ensemble approaches, the neural conductor and all base predictors of Forchestra are trained in an end-to-end manner; this allows each base predictor to modify its reaction to different inputs, while supporting other predictors and constructing a final prediction jointly. We empirically show that the model size is scalable up to 0.8 billion parameters (approximately equivalent to a 400-layer LSTM). The proposed method is evaluated on our proprietary E-Commerce (100K) and the public M5 (30K) datasets, and it outperforms existing forecasting models by a significant margin. In addition, we observe that our framework generalizes well to unseen data points when evaluated in a zero-shot fashion on downstream datasets. Last but not least, we present extensive qualitative and quantitative studies to analyze how the proposed model outperforms baseline models and differs from conventional ensemble approaches. The code is available at https://github.com/young-j-park/22-ICDM-Forchestra.
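The abstract describes a weighted ensemble in which a neural conductor reads a representation of the input series, assigns an importance weight to each base predictor, and combines the base forecasts into a single prediction, with everything trained end-to-end. The sketch below illustrates that pattern only; it is a minimal PyTorch approximation under assumed design choices (LSTM base predictors, an MLP representation module, illustrative class and parameter names), not the authors' implementation, which is available at the repository linked above.

```python
# Minimal sketch of the conductor-weighted ensemble pattern described in the abstract.
# All component choices and names here are illustrative assumptions.
import torch
import torch.nn as nn


class BasePredictor(nn.Module):
    """One base predictor: maps a historical window to an H-step forecast."""

    def __init__(self, input_len: int, horizon: int, hidden: int = 32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, horizon)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, input_len) univariate demand history
        out, _ = self.lstm(x.unsqueeze(-1))            # (batch, input_len, hidden)
        return self.head(out[:, -1])                   # (batch, horizon)


class NeuralConductor(nn.Module):
    """Assigns an importance weight to each base predictor from a series representation."""

    def __init__(self, input_len: int, num_predictors: int, repr_dim: int = 64):
        super().__init__()
        self.representation = nn.Sequential(           # stand-in representation module
            nn.Linear(input_len, repr_dim), nn.ReLU()
        )
        self.weight_head = nn.Linear(repr_dim, num_predictors)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = self.representation(x)                     # (batch, repr_dim)
        return torch.softmax(self.weight_head(z), -1)  # (batch, num_predictors)


class EnsembleForecaster(nn.Module):
    """Conductor weights aggregate the base predictors' forecasts into one prediction."""

    def __init__(self, input_len: int, horizon: int, num_predictors: int = 4):
        super().__init__()
        self.predictors = nn.ModuleList(
            BasePredictor(input_len, horizon) for _ in range(num_predictors)
        )
        self.conductor = NeuralConductor(input_len, num_predictors)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        forecasts = torch.stack([p(x) for p in self.predictors], dim=1)  # (B, P, H)
        weights = self.conductor(x).unsqueeze(-1)                        # (B, P, 1)
        return (weights * forecasts).sum(dim=1)                          # (B, H)


if __name__ == "__main__":
    model = EnsembleForecaster(input_len=28, horizon=7)
    history = torch.randn(16, 28)                      # 16 series, 28-step history
    prediction = model(history)                        # (16, 7) aggregated forecast
    # End-to-end training: one loss backpropagates through the conductor
    # and all base predictors jointly, as the abstract emphasizes.
    loss = nn.functional.l1_loss(prediction, torch.randn(16, 7))
    loss.backward()
```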
Pages: 378-387 (10 pages)
Related papers (50 in total)
  • [11] The station-free sharing bike demand forecasting with a deep learning approach and large-scale datasets
    Xu, Chengcheng
    Ji, Junyi
    Liu, Pan
    TRANSPORTATION RESEARCH PART C-EMERGING TECHNOLOGIES, 2018, 95 : 47 - 60
  • [12] A LANGEVINIZED ENSEMBLE KALMAN FILTER FOR LARGE-SCALE DYNAMIC LEARNING
    Zhang, Peiyi
    Song, Qifan
    Liang, Faming
    STATISTICA SINICA, 2024, 34 : 1071 - 1091
  • [13] Large-scale multi-label ensemble learning on Spark
    Gonzalez-Lopez, Jorge
    Cano, Alberto
    Ventura, Sebastian
    2017 16TH IEEE INTERNATIONAL CONFERENCE ON TRUST, SECURITY AND PRIVACY IN COMPUTING AND COMMUNICATIONS / 11TH IEEE INTERNATIONAL CONFERENCE ON BIG DATA SCIENCE AND ENGINEERING / 14TH IEEE INTERNATIONAL CONFERENCE ON EMBEDDED SOFTWARE AND SYSTEMS, 2017, : 893 - 900
  • [14] An elastic framework for ensemble-based large-scale data assimilation
    Friedemann, Sebastian
    Raffin, Bruno
    INTERNATIONAL JOURNAL OF HIGH PERFORMANCE COMPUTING APPLICATIONS, 2022, 36 (04): : 543 - 563
  • [15] An Incremental Learning framework for Large-scale CTR Prediction
    Katsileros, Petros
    Mandilaras, Nikiforos
    Mallis, Dimitrios
    Pitsikalis, Vassilis
    Theodorakis, Stavros
    Chamiel, Gil
    PROCEEDINGS OF THE 16TH ACM CONFERENCE ON RECOMMENDER SYSTEMS, RECSYS 2022, 2022, : 490 - 493
  • [16] Ensemble Approach for Time Series Analysis in Demand Forecasting Ensemble Learning
    Akyuz, A. Okay
    Bulbul, Berna Atak
    Uysal, Mitat
    Uysal, M. Ozan
    2017 IEEE INTERNATIONAL CONFERENCE ON INNOVATIONS IN INTELLIGENT SYSTEMS AND APPLICATIONS (INISTA), 2017, : 7 - 12
  • [17] StackBRAF: A Large-Scale Stacking Ensemble Learning for BRAF Affinity Prediction
    Syahid, Nur Fadhilah
    Weerapreeyakul, Natthida
    Srisongkram, Tarapong
    ACS OMEGA, 2023, 8 (23): : 20881 - 20891
  • [18] Effective ensemble learning approach for large-scale medical data analytics
    Namamula, Lakshmana Rao
    Chaytor, Daniel
    INTERNATIONAL JOURNAL OF SYSTEM ASSURANCE ENGINEERING AND MANAGEMENT, 2024, 15 (01) : 13 - 20
  • [20] An Ensemble Learning Platform for the Large-Scale Exploration of New Double Perovskites
    Wang, Zhilong
    Han, Yanqiang
    Lin, Xirong
    Cai, Junfei
    Wu, Sicheng
    Li, Jinjin
    ACS APPLIED MATERIALS & INTERFACES, 2022, 14 (01) : 717 - 725