📄 Abstract
Diffusion-based generative processes, formulated as differential equation
solving, must trade off computational speed against sample quality. Our
theoretical investigation of ODE- and SDE-based solvers reveals complementary
weaknesses: ODE solvers accumulate irreducible gradient error along
deterministic trajectories, while SDE methods suffer from amplified
discretization errors when the step budget is limited. Building upon this
insight, we introduce AdaSDE, a novel single-step SDE solver that aims to unify
the efficiency of ODEs with the error resilience of SDEs. Specifically, we
introduce a single per-step learnable coefficient, estimated via lightweight
distillation, which dynamically regulates the error correction strength to
accelerate diffusion sampling. Notably, our framework can be integrated with
existing solvers to enhance their capabilities. Extensive experiments
demonstrate state-of-the-art performance: at 5 NFE, AdaSDE achieves FID scores
of 4.18 on CIFAR-10, 8.05 on FFHQ, and 6.96 on LSUN Bedroom. Code is available
at https://github.com/WLU-wry02/AdaSDE.
Authors (5)
Ruoyu Wang
Beier Zhu
Junzhi Li
Liangyu Yuan
Chi Zhang
Submitted
October 27, 2025
Key Contributions
Introduces AdaSDE, a novel single-step SDE solver for diffusion models that accelerates sampling by dynamically regulating error correction strength using a learnable coefficient estimated via lightweight distillation. This aims to combine the efficiency of ODE solvers with the error resilience of SDE solvers.
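The abstract does not give the update rule, but the described mechanism (an ODE-like step plus a stochastic correction whose strength is set by a learned per-step coefficient) can be illustrated with a minimal sketch. All names here (`adasde_step`, `gamma`, the toy drift form) are assumptions for illustration, not the paper's actual implementation:

```python
import numpy as np

def adasde_step(x, t_cur, t_next, score_fn, gamma, rng):
    """One hypothetical AdaSDE-style step (illustrative only).

    Combines a deterministic Euler step along the probability-flow ODE
    with a gamma-scaled stochastic correction. `gamma` is the learnable
    per-step coefficient; gamma = 0 recovers a pure ODE step.
    """
    dt = t_next - t_cur
    # Deterministic drift derived from the score (ODE part).
    drift = -t_cur * score_fn(x, t_cur)
    x_next = x + drift * dt
    # Stochastic error-correction term, modulated by the learned coefficient.
    noise_scale = gamma * np.sqrt(abs(dt))
    return x_next + noise_scale * rng.standard_normal(x.shape)
```

In this reading, distillation would fit one `gamma` per sampling step so that large steps (few NFE) get just enough stochasticity to correct accumulated error without amplifying discretization noise.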
Business Value
Enables faster generation of high-quality synthetic data, accelerating workflows in creative industries, data augmentation for training ML models, and content generation applications.