📄 Abstract
Recently, optimization on Riemannian manifolds has provided valuable
insights to the optimization community. In this regard, extending these methods
to the Wasserstein space is of particular interest, since optimization on
Wasserstein space is closely connected to practical sampling processes.
Generally, the standard (continuous) optimization method on Wasserstein space
is the Riemannian gradient flow (i.e., Langevin dynamics when minimizing the KL
divergence). In this paper, we aim to enrich the family of continuous
optimization methods in the Wasserstein space by extending the gradient flow
on it to the stochastic gradient descent (SGD) flow and the stochastic variance
reduced gradient (SVRG) flow.
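
For reference, the gradient-flow/Langevin connection invoked above is standard background, not an excerpt from the paper: for a target π ∝ e^(−V), the Wasserstein gradient flow of KL(ρ‖π) is the Fokker-Planck equation, whose particle-level dynamics are the Langevin SDE.

```latex
% Background identity: Langevin SDE and its Fokker-Planck equation,
% the Wasserstein gradient flow of KL(rho || pi) with pi ∝ exp(-V).
dX_t = -\nabla V(X_t)\,dt + \sqrt{2}\,dB_t
\quad\Longleftrightarrow\quad
\partial_t \rho_t = \nabla\cdot\big(\rho_t \nabla V\big) + \Delta \rho_t
= \nabla\cdot\Big(\rho_t \nabla \log \tfrac{\rho_t}{\pi}\Big)
```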
By leveraging the properties of Wasserstein space, we construct stochastic
differential equations (SDEs) that approximate the corresponding discrete
Euclidean dynamics of the desired Riemannian stochastic methods. We then
obtain the flows in Wasserstein space via the Fokker-Planck equation. Finally,
we establish convergence rates for the proposed stochastic flows, which align
with those known in the Euclidean setting.
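
As a rough illustration of the discrete Euclidean dynamics an SGD flow would approximate, here is a minimal stochastic-gradient Langevin sketch: an Euler-Maruyama step whose drift uses a minibatch gradient estimate. The function `sgld_sample` and the finite-sum splitting of ∇V are assumptions for illustration only, not the paper's construction.

```python
import numpy as np

def sgld_sample(grad_components, x0, step=1e-3, n_steps=5000,
                batch_size=10, rng=None):
    """Stochastic-gradient Langevin sketch (illustrative, not the paper's method).

    grad_components: callables g_i with sum_i g_i(x) = grad V(x); a minibatch
    average gives an unbiased estimate of grad V, mirroring the discrete SGD
    dynamics that an SGD flow in Wasserstein space would approximate.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = len(grad_components)
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(n_steps):
        idx = rng.choice(n, size=batch_size, replace=False)
        # Unbiased minibatch estimate of grad V(x).
        g = (n / batch_size) * sum(grad_components[i](x) for i in idx)
        # Euler-Maruyama step: drift -grad V, diffusion coefficient sqrt(2).
        x = x - step * g + np.sqrt(2.0 * step) * rng.standard_normal(x.shape)
    return x

# Example: sample from N(0, 1), with V(x) = x^2 / 2 split into 4 components.
comps = [lambda x: 0.25 * x] * 4
x = sgld_sample(comps, x0=np.array([3.0]), step=1e-2,
                n_steps=2000, batch_size=2)
```

In the small-step limit, the law of such iterates tracks a Fokker-Planck-type evolution, which is the kind of continuous object the abstract refers to.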
Key Contributions
Extends the family of continuous optimization methods on Wasserstein space by introducing continuous-time Riemannian SGD and SVRG flows. The paper constructs SDEs that approximate the corresponding discrete Euclidean dynamics and derives the flows via Fokker-Planck equations, enriching the toolkit of optimization methods for sampling processes.
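
For the SVRG flow, the discrete counterpart replaces the minibatch estimate with a variance-reduced one. Below is a minimal sketch under the same assumptions as above (`svrg_langevin` and the anchor/epoch schedule are illustrative, not taken from the paper):

```python
import numpy as np

def svrg_langevin(grad_components, x0, step=1e-3, n_epochs=20,
                  epoch_len=100, rng=None):
    """Langevin steps driven by an SVRG-style gradient estimator (sketch).

    grad_components: callables g_j with sum_j g_j = grad V. Each epoch
    refreshes an anchor point and caches the full gradient there; the
    per-step estimator remains unbiased and has low variance near the anchor.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = len(grad_components)
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(n_epochs):
        anchor = x.copy()
        full_grad = sum(g(anchor) for g in grad_components)  # grad V(anchor)
        for _ in range(epoch_len):
            i = rng.integers(n)
            # Unbiased variance-reduced estimate of grad V(x).
            g = (n * (grad_components[i](x) - grad_components[i](anchor))
                 + full_grad)
            x = x - step * g + np.sqrt(2.0 * step) * rng.standard_normal(x.shape)
    return x
```

The estimator stays unbiased while its variance shrinks as the iterate approaches the epoch anchor, which is the usual motivation for SVRG-type dynamics.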
Business Value
Provides theoretical foundations for more advanced and efficient sampling techniques used in generative models, Bayesian inference, and other areas that require manipulating complex probability distributions.