AdaDiff: Accelerating Diffusion Models Through Step-Wise Adaptive Computation

Cited by: 0
Authors
Tang, Shengkun
Wang, Yaqing [1 ]
Ding, Caiwen [2 ]
Liang, Yi [1 ]
Li, Yao [3 ]
Xu, Dongkuan [4 ]
Affiliations
[1] Google, Menlo Pk, CA USA
[2] Univ Connecticut, Mansfield, CT USA
[3] UNC Chapel Hill, Chapel Hill, NC USA
[4] North Carolina State Univ, Raleigh, NC USA
DOI
10.1007/978-3-031-72986-7_5
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Diffusion models achieve great success in generating diverse and high-fidelity images, yet their widespread application, especially in real-time scenarios, is hampered by their inherently slow generation speed. The slow generation stems from the necessity of multi-step network inference. While some predictions benefit from the full computation of the model in each sampling iteration, not every iteration requires the same amount of computation, potentially leading to inefficiency. Unlike typical adaptive computation challenges that deal with single-step generation problems, diffusion processes with multi-step generation need to dynamically adjust their computational resource allocation based on an ongoing assessment of each step's importance to the final image output, presenting a unique set of challenges. In this work, we propose AdaDiff, an adaptive framework that dynamically allocates computation resources in each sampling step to improve the generation efficiency of diffusion models. To assess the effects of changes in computational effort on image quality, we present a timestep-aware uncertainty estimation module (UEM). Integrated at each intermediate layer, the UEM evaluates the predictive uncertainty, which serves as an indicator for determining whether to terminate the inference process early. Additionally, we introduce an uncertainty-aware layer-wise loss aimed at bridging the performance gap between full models and their adaptive counterparts. Comprehensive experiments including class-conditional, unconditional, and text-guided image generation across multiple datasets demonstrate the superior performance and efficiency of AdaDiff relative to current early-exiting techniques in diffusion models. Notably, we observe enhanced performance on FID, with a computation reduction of around 45%. Another exciting observation is that adaptive computation can synergize with other efficiency-enhancing methods, such as reducing sampling steps, to accelerate inference.
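The early-exit mechanism the abstract describes can be sketched as follows. This is a minimal illustrative toy, not the authors' implementation: `uem` here is just the activation variance standing in for the paper's learned, timestep-aware uncertainty estimation module, and `exit_threshold`, `layer`, and `adaptive_step` are names assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def layer(h, w):
    # One toy network block; a tanh MLP layer stands in for a
    # diffusion model's intermediate layer.
    return np.tanh(h @ w)

def uem(h):
    # Illustrative stand-in for the uncertainty estimation module:
    # simply the variance of the activations. The real UEM is a
    # learned module attached to each intermediate layer.
    return float(np.var(h))

def adaptive_step(x, weights, exit_threshold):
    # Run layers in order; stop as soon as estimated uncertainty
    # drops below `exit_threshold`, skipping the remaining layers
    # (early exit). Otherwise run the full model.
    h = x
    for depth, w in enumerate(weights, start=1):
        h = layer(h, w)
        if uem(h) < exit_threshold:
            return h, depth
    return h, len(weights)

d = 8
weights = [0.1 * rng.standard_normal((d, d)) for _ in range(6)]
x = rng.standard_normal((1, d))
out, layers_used = adaptive_step(x, weights, exit_threshold=0.05)
print(f"layers used: {layers_used} / {len(weights)}")
```

In the paper's setting this decision is made independently at every sampling timestep, so easy steps spend fewer layers while hard steps still use the full network.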
Pages: 73-90 (18 pages)
Related Papers (50 in total)
  • [1] Outsourcing Service Provision Through Step-Wise Transformation
    Clark, Tony
    Barn, Balbir S.
    PROCEEDINGS OF THE 7TH INDIA SOFTWARE ENGINEERING CONFERENCE 2014, ISEC '14, 2014,
  • [2] Efficient Diffusion Transformer with Step-Wise Dynamic Attention Mediators
    Pu, Yifan
    Xia, Zhuofan
    Guo, Jiayi
    Han, Dongchen
    Li, Qixiu
    Li, Duo
    Yuan, Yuhui
    Li, Ji
    Han, Yizeng
    Song, Shiji
    Huang, Gao
    Li, Xiu
    COMPUTER VISION - ECCV 2024, PT XV, 2025, 15073 : 424 - 441
  • [3] Step-Wise Deep Learning Models for Solving Routing Problems
    Xin, Liang
    Song, Wen
    Cao, Zhiguang
    Zhang, Jie
    IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2021, 17 (07) : 4861 - 4871
  • [4] Improving laboratory capacity in Africa through step-wise accreditation program
    Abimiku, Alash'le G.
    JAIDS-JOURNAL OF ACQUIRED IMMUNE DEFICIENCY SYNDROMES, 2013, 62 : 32 - 32
  • [5] Development of streamflow prediction models for a weir using ANN and step-wise regression
    Hassan M.
    Zaffar H.
    Mehmood I.
    Khitab A.
    Modeling Earth Systems and Environment, 2018, 4 (3) : 1021 - 1028
  • [6] Stochastic step-wise feature selection for Exponential Random Graph Models (ERGMs)
    El-Zaatari, Helal
    Yu, Fei
    Kosorok, Michael R.
    PLOS ONE, 2024, 19 (12):
  • [7] Basement membrane proteins in step-wise in vitro culture models of oral cancer progression
    Kulasekara, KMKK
    Costea, DE
    Oijorsbakken, G
    Vintermyr, OK
    Johannessen, AC
    ORAL ONCOLOGY, 2005, 1 (01) : 150 - 150
  • [8] A Step-Wise Multiple Testing for Linear Regression Models with Application to the Study of Resting Energy Expenditure
    Zhang, Junyi
    Wang, Zimian
    Jin, Zhezhen
    Ying, Zhiliang
    STATISTICS IN BIOSCIENCES, 2023, 15 (01) : 163 - 192
  • [9] Step-wise evolution of mental models of electric circuits: A "learning-aloud" case study
    Clement, JJ
    Steinberg, MS
    JOURNAL OF THE LEARNING SCIENCES, 2002, 11 (04) : 389 - 452