A Large-Scale Ensemble Learning Framework for Demand Forecasting

Cited by: 1
Authors
Park, Young-Jin [1 ]
Kim, Donghyun [2 ]
Odermatt, Frederic [3 ]
Lee, Juho [4 ]
Kim, Kyung-Min [5 ,6 ]
Institutions
[1] MIT, Cambridge, MA 02139 USA
[2] Seoul Natl Univ, Seoul, South Korea
[3] ETH Zurich, Zurich, Switzerland
[4] Superpetual Inc, Seoul, South Korea
[5] NAVER CLOVA, Seongnam, South Korea
[6] NAVER Lab, Seongnam, South Korea
DOI
10.1109/ICDM54844.2022.00048
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Demand forecasting is a crucial component of supply chain management for revenue optimization and inventory planning. Traditional time series forecasting methods, however, have resulted in small models with limited expressive power, because they have difficulty scaling up model size while maintaining high accuracy. In this paper, we propose Forecasting orchestra (Forchestra), a simple but powerful ensemble framework capable of accurately predicting future demand for a diverse range of items. Forchestra consists of two parts: 1) base predictors and 2) a neural conductor. For a given time series, each base predictor outputs its respective forecast based on historical observations. On top of the base predictors, the neural conductor adaptively assigns an importance weight to each predictor by looking at the representation vector provided by a representation module. Finally, Forchestra aggregates the predictions according to these weights and constructs a final prediction. In contrast to previous ensemble approaches, the neural conductor and all base predictors of Forchestra are trained in an end-to-end manner; this allows each base predictor to modify its reaction to different inputs, while supporting other predictors and constructing a final prediction jointly. We empirically show that the model size is scalable up to 0.8 billion parameters (approximately a 400-layer LSTM). The proposed method is evaluated on our proprietary E-Commerce (100K) and the public M5 (30K) datasets, and it outperforms existing forecasting models by a significant margin. In addition, we observe that our framework generalizes well to unseen data points when evaluated in a zero-shot fashion on downstream datasets. Last but not least, we present extensive qualitative and quantitative studies to analyze how the proposed model outperforms baseline models and differs from conventional ensemble approaches. The code is available at https://github.com/young-j-park/22-ICDM-Forchestra.
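The aggregation step described in the abstract can be illustrated with a minimal sketch: each base predictor produces a forecast, the conductor scores the predictors, and a softmax over those scores weights the final combination. This is an illustrative reading of the abstract only, not the authors' implementation (see their repository for that); the function names, the toy predictors, and the fixed-logit conductor below are all hypothetical stand-ins, and in the paper the conductor is a neural network trained end-to-end with the predictors.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over a 1-D array of logits."""
    e = np.exp(z - np.max(z))
    return e / e.sum()

def forchestra_forecast(history, predictors, conductor):
    """Weighted ensemble forecast in the style sketched by the abstract.

    history:    1-D array of past observations
    predictors: list of callables, each mapping history -> forecast array
    conductor:  callable mapping history -> one logit per predictor
                (a learned network in the paper; any callable here)
    """
    preds = np.stack([p(history) for p in predictors])   # (P, horizon)
    weights = softmax(conductor(history))                # (P,)
    return weights @ preds                               # (horizon,)

# Two toy base predictors over a 7-step horizon (illustrative only)
naive = lambda h: np.repeat(h[-1], 7)       # repeat the last observation
mean = lambda h: np.repeat(h.mean(), 7)     # repeat the historical mean
# Fixed logits standing in for the learned conductor network
conductor = lambda h: np.array([1.0, 0.0])  # favors the naive predictor

hist = np.array([10.0, 12.0, 11.0, 13.0])
forecast = forchestra_forecast(hist, [naive, mean], conductor)
```

Because the final forecast is a softmax-weighted sum, its values always lie between the most pessimistic and most optimistic base predictions; end-to-end training would adjust both the predictors and the conductor logits jointly.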
Pages: 378 - 387 (10 pages)
Related Papers (50 total)
  • [41] Two-stage based ensemble optimization framework for large-scale global optimization
    Wang, Yu
    Huang, Jin
    Dong, Wei Shan
    Yan, Jun Chi
    Tian, Chun Hua
    Li, Min
    Mo, Wen Ting
    EUROPEAN JOURNAL OF OPERATIONAL RESEARCH, 2013, 228 (02) : 308 - 320
  • [42] Water and energy demand forecasting in large-scale water distribution networks for irrigation using open data and machine learning algorithms
    Gonzalez Perea, Rafael
    Ballesteros, Rocio
    Ortega, Jose F.
    Angel Moreno, Miguel
    COMPUTERS AND ELECTRONICS IN AGRICULTURE, 2021, 188
  • [43] Ensemble-On-Demand Kalman Filter for Large-Scale Systems with Time-Sparse Measurements
    Kim, In Sung
    Teixeira, Bruno O. S.
    Bernstein, Dennis S.
    47TH IEEE CONFERENCE ON DECISION AND CONTROL (CDC 2008), 2008, : 3199 - 3204
  • [44] Construction of Ensemble Learning Model for Home Appliance Demand Forecasting
    Duan, Ganglong
    Dong, Jiayi
    APPLIED SCIENCES-BASEL, 2024, 14 (17)
  • [45] An Approach for Demand Forecasting in Steel Industries Using Ensemble Learning
    Raju, S. M. Taslim Uddin
    Sarker, Amlan
    Das, Apurba
    Islam, Md Milon
    Al-Rakhami, Mabrook S.
    Al-Amri, Atif M.
    Mohiuddin, Tasniah
    Albogamy, Fahad R.
    COMPLEXITY, 2022, 2022
  • [46] An Online Anomaly Learning and Forecasting Model for Large-Scale Service of Internet of Thing
    Wang, JunPing
    Duan, Shihui
    2014 International Conference on Identification, Information and Knowledge in the Internet of Things (IIKI 2014), 2014, : 152 - 157
  • [47] LSEC: Large-scale spectral ensemble clustering
    Li, Hongmin
    Ye, Xiucai
    Imakura, Akira
    Sakurai, Tetsuya
    INTELLIGENT DATA ANALYSIS, 2023, 27 (01) : 59 - 77
  • [48] An ensemble bat algorithm for large-scale optimization
    Cai, Xingjuan
    Zhang, Jiangjiang
    Liang, Hao
    Wang, Lei
    Wu, Qidi
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2019, 10 (11) : 3099 - 3113
  • [49] An ensemble bat algorithm for large-scale optimization
    Xingjuan Cai
    Jiangjiang Zhang
    Hao Liang
    Lei Wang
    Qidi Wu
    International Journal of Machine Learning and Cybernetics, 2019, 10 : 3099 - 3113
  • [50] A framework for generating large-scale microphone array data for machine learning
    Kujawski, Adam
    Pelling, Art J. R.
    Jekosch, Simon
    Sarradj, Ennes
    MULTIMEDIA TOOLS AND APPLICATIONS, 2024, 83 (11) : 31211 - 31231