Abstract: Denoising diffusion probabilistic models (DDPMs) represent a recent advance
in generative modelling that has delivered state-of-the-art results across many
domains of application. Despite their success, a rigorous theoretical
understanding of the error within DDPMs, particularly the non-asymptotic bounds
required to compare their efficiency, remains scarce. Making minimal
assumptions on the initial data distribution, allowing, for example, the manifold
hypothesis, this paper presents explicit non-asymptotic bounds on the forward
diffusion error in total variation (TV), expressed as a function of the
terminal time $T$.
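For concreteness, the canonical forward dynamics referred to below can be written out explicitly. The following is a minimal sketch assuming the standard normalisation of the OU process used in DDPMs; it is illustrative rather than a formula quoted from the paper.

```latex
% Forward OU process started at the data distribution (standard
% normalisation, assumed here for illustration):
\[
  \mathrm{d}X_t = -X_t\,\mathrm{d}t + \sqrt{2}\,\mathrm{d}B_t,
  \qquad X_0 \sim p_{\mathrm{data}} .
\]
% Conditionally on X_0 = x_0 the marginal is an explicit Gaussian, and the
% invariant measure is the standard Gaussian \pi = N(0, I_d):
\[
  X_t \mid X_0 = x_0 \;\sim\; \mathcal{N}\!\bigl(e^{-t}x_0,\,(1-e^{-2t})\,I_d\bigr),
  \qquad \pi = \mathcal{N}(0, I_d).
\]
% The forward diffusion error at terminal time T is then measured in TV:
\[
  \mathrm{Err}(T) \;=\; \bigl\|\,\mathcal{L}(X_T) - \pi\,\bigr\|_{\mathrm{TV}} .
\]
```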
We parametrise multi-modal data distributions in terms of the distance $R$ to
their furthest modes and consider forward diffusions with additive and
multiplicative noise. Our analysis rigorously proves that, under mild
assumptions, the canonical choice of the Ornstein-Uhlenbeck (OU) process cannot
be significantly improved in terms of reducing the terminal time $T$ as a
function of $R$ and the error tolerance $\varepsilon > 0$.
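Schematically, this optimality claim can be read as a pair of matching bounds, as sketched below. The logarithmic scaling is an assumption suggested by standard OU mixing estimates from distance $R$; the paper's precise constants and its class of competing forward diffusions are not reproduced here.

```latex
% Schematic reading of the optimality claim (assumed scaling, for
% illustration only): the OU process reaches TV error \varepsilon by a
% terminal time T of order log(R/\varepsilon), and no admissible forward
% diffusion does significantly better as a function of R and \varepsilon:
\[
  T_{\mathrm{OU}}(R,\varepsilon) \;\lesssim\; \log\!\frac{R}{\varepsilon},
  \qquad
  \inf_{\text{admissible forward diffusions}} T(R,\varepsilon)
  \;\gtrsim\; \log\!\frac{R}{\varepsilon} .
\]
```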
Motivated by data distributions arising in generative modelling, we also
establish a cut-off-like phenomenon (as $R\to\infty$) for the convergence in TV
of an OU process, initialised at a multi-modal distribution with maximal mode
distance $R$, to its invariant measure.
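In the sense familiar from the Markov-chain literature, a TV cut-off at a time scale $t_R$ means the distance to equilibrium drops abruptly from $1$ to $0$ around $t_R$. The sketch below states this in a standard form; the choice $t_R = \log R$ is an assumption consistent with OU mixing from distance $R$, not a quotation from the paper.

```latex
% Cut-off (schematic): \mu_R denotes a multi-modal initial distribution
% with maximal mode distance R, and t_R the cut-off time scale
% (t_R = \log R is assumed here for illustration):
\[
  \lim_{R \to \infty}
  \bigl\|\,\mathcal{L}^{\mu_R}(X_{c\,t_R}) - \pi\,\bigr\|_{\mathrm{TV}}
  \;=\;
  \begin{cases}
    1, & c < 1,\\
    0, & c > 1,
  \end{cases}
\]
% i.e. the forward diffusion is far from equilibrium just before t_R and
% essentially at equilibrium just after it.
```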