Convolutional State Space Models for Long-Range Spatiotemporal Modeling

Cited by: 0
Authors
Smith, Jimmy T. H. [2 ,4 ]
De Mello, Shalini [1 ]
Kautz, Jan [1 ]
Linderman, Scott W. [3 ,4 ]
Byeon, Wonmin [1 ]
Affiliations
[1] NVIDIA, Santa Clara, CA USA
[2] Stanford Univ, Inst Computat & Math Engn, Stanford, CA 94305 USA
[3] Stanford Univ, Dept Stat, Stanford, CA 94305 USA
[4] Stanford Univ, Wu Tsai Neurosci Inst, Stanford, CA 94305 USA
Keywords
TIME;
DOI
none
CLC number (Chinese Library Classification)
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Effectively modeling long spatiotemporal sequences is challenging due to the need to model complex spatial correlations and long-range temporal dependencies simultaneously. ConvLSTMs attempt to address this by updating tensor-valued states with recurrent neural networks, but their sequential computation makes them slow to train. In contrast, Transformers can process an entire spatiotemporal sequence, compressed into tokens, in parallel. However, the cost of attention scales quadratically in length, limiting their scalability to longer sequences. Here, we address the challenges of prior methods and introduce convolutional state space models (ConvSSM) that combine the tensor modeling ideas of ConvLSTM with the long sequence modeling approaches of state space methods such as S4 and S5. First, we demonstrate how parallel scans can be applied to convolutional recurrences to achieve subquadratic parallelization and fast autoregressive generation. We then establish an equivalence between the dynamics of ConvSSMs and SSMs, which motivates parameterization and initialization strategies for modeling long-range dependencies. The result is ConvS5, an efficient ConvSSM variant for long-range spatiotemporal modeling. ConvS5 significantly outperforms Transformers and ConvLSTM on a long-horizon Moving-MNIST experiment while training 3x faster than ConvLSTM and generating samples 400x faster than Transformers. In addition, ConvS5 matches or exceeds the performance of state-of-the-art methods on challenging DMLab, Minecraft and Habitat prediction benchmarks and enables new directions for modeling long spatiotemporal sequences.
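The abstract's central computational claim is that a convolutional recurrence with a linear state update can be evaluated by a parallel (associative) scan rather than a strictly sequential loop. The sketch below illustrates why, under a simplifying assumption: the state-transition kernel is pointwise (1x1), so each scan element stays fixed-size. The function names (`conv_recurrence_sequential`, `combine`, `hillis_steele_scan`) are illustrative, not taken from the paper's code.

```python
import numpy as np

def conv_recurrence_sequential(a, Bu):
    """Reference loop: x_t = a * x_{t-1} + Bu[t], zero initial state.
    `a` plays the role of a pointwise (1x1) state-transition kernel;
    Bu[t] is the (pre-convolved) input frame at step t."""
    x = np.zeros_like(Bu[0])
    out = []
    for t in range(Bu.shape[0]):
        x = a * x + Bu[t]
        out.append(x)
    return np.stack(out)

def combine(left, right):
    """Associative operator on scan elements (a, b): composing the
    affine map of `left` with that of `right` is again an affine map."""
    a1, b1 = left
    a2, b2 = right
    return (a2 * a1, a2 * b1 + b2)

def hillis_steele_scan(elems):
    """Inclusive scan in log2(T) passes; every pass is embarrassingly
    parallel, which is the source of the training-time speedup."""
    n, step = len(elems), 1
    elems = list(elems)
    while step < n:
        new = list(elems)
        for i in range(step, n):
            new[i] = combine(elems[i - step], elems[i])
        elems, step = new, step * 2
    return elems

# The b-component of each prefix equals the recurrent state x_t.
rng = np.random.default_rng(0)
a = rng.uniform(0.5, 0.99, size=(4, 4))   # stable pointwise kernel
Bu = rng.normal(size=(8, 4, 4))           # T = 8 input frames
xs_scan = np.stack(
    [b for _, b in hillis_steele_scan([(a, Bu[t]) for t in range(8)])]
)
assert np.allclose(conv_recurrence_sequential(a, Bu), xs_scan)
```

In ConvS5 itself the state kernel is further structured and initialized using SSM theory, and the scan would be run with a parallel primitive such as `jax.lax.associative_scan`; this sketch only demonstrates why the recurrence admits such a scan at all.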
Pages: 40