Spatial Mixture-of-Experts

Cited: 0
Authors
Dryden, Nikoli [1 ]
Hoefler, Torsten [1 ]
Institution
[1] Swiss Fed Inst Technol, Zurich, Switzerland
Funding
EU Horizon 2020
Keywords
DOI
Not available
CLC classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Many data have an underlying dependence on spatial location, be it weather on the Earth, a simulation on a mesh, or a registered image. Yet this feature is rarely taken advantage of, and it violates common assumptions made by many neural network layers, such as translation equivariance. Further, many works that do incorporate locality fail to capture fine-grained structure. To address this, we introduce the Spatial Mixture-of-Experts (SMOE) layer, a sparsely-gated layer that learns spatial structure in the input domain and routes experts at a fine-grained level to utilize it. We also develop new techniques to train SMOEs, including a self-supervised routing loss and damping expert errors. Finally, we show strong results for SMOEs on numerous tasks, and set new state-of-the-art results for medium-range weather prediction and post-processing of ensemble weather forecasts.
Pages: 17
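The abstract describes routing experts at a fine-grained, per-location level based on learned spatial structure. The sketch below is a minimal illustration, not the authors' implementation: the class name SpatialMoE2d, the 1x1-convolution experts, the fixed grid size, and top-1 routing from learned per-location logits are all assumed for the example; the paper's self-supervised routing loss and expert-error damping are not shown.

```python
# Minimal sketch of per-location, top-1 expert routing over a 2D grid (PyTorch).
import torch
import torch.nn as nn
import torch.nn.functional as F

class SpatialMoE2d(nn.Module):
    def __init__(self, in_ch, out_ch, num_experts=4, grid_hw=(32, 32)):
        super().__init__()
        # Learned per-location routing logits: one score per expert at every
        # grid cell, so routing can depend on spatial position itself.
        self.route_logits = nn.Parameter(torch.zeros(num_experts, *grid_hw))
        # Each expert is a small 1x1 convolution over the feature channels.
        self.experts = nn.ModuleList(
            [nn.Conv2d(in_ch, out_ch, kernel_size=1) for _ in range(num_experts)]
        )

    def forward(self, x):                                   # x: (B, C, H, W)
        gates = F.softmax(self.route_logits, dim=0)         # (E, H, W)
        top1 = gates.argmax(dim=0)                          # (H, W) expert index
        # Hard top-1 mask, scaled by the softmax gate value at the selected
        # expert so the routing parameters still receive gradients.
        mask = F.one_hot(top1, num_classes=len(self.experts))  # (H, W, E)
        mask = mask.permute(2, 0, 1).float() * gates            # (E, H, W)
        out = 0.0
        for e, expert in enumerate(self.experts):
            out = out + expert(x) * mask[e]                 # broadcast over batch
        return out

# Usage: route 16-channel features on a 32x32 grid among 4 experts.
layer = SpatialMoE2d(in_ch=16, out_ch=16, num_experts=4, grid_hw=(32, 32))
y = layer(torch.randn(2, 16, 32, 32))                       # y: (2, 16, 32, 32)
```

For clarity this sketch runs every expert densely over the whole grid and masks the result; a sparse implementation would instead gather each expert's assigned locations before applying it.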