Abstract
Simulation-free training frameworks have been at the forefront of the
generative modelling revolution in continuous spaces, leading to large-scale
diffusion and flow matching models. However, such modern generative models
suffer from expensive inference, inhibiting their use in numerous scientific
applications like Boltzmann Generators (BGs) for molecular conformations that
require fast likelihood evaluation. In this paper, we revisit classical
normalizing flows in the context of BGs that offer efficient sampling and
likelihoods, but whose training via maximum likelihood is often unstable and
computationally challenging. We propose Regression Training of Normalizing
Flows (RegFlow), a novel and scalable regression-based training objective that
bypasses the numerical instability and computational challenge of conventional
maximum likelihood training in favour of a simple $\ell_2$-regression
objective. Specifically, RegFlow maps prior samples under our flow to targets
computed using optimal transport couplings or a pre-trained continuous
normalizing flow (CNF). To enhance numerical stability, RegFlow employs
effective regularization strategies such as a new forward-backward
self-consistency loss that is simple to implement. Empirically, we
demonstrate that RegFlow unlocks a broader class of architectures that were
previously intractable to train for BGs with maximum likelihood. We also show
that RegFlow improves on the performance, stability, and computational cost of
maximum likelihood training for equilibrium sampling in Cartesian coordinates
of alanine dipeptide, tripeptide, and tetrapeptide, showcasing its potential
for molecular systems.
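To make the objective concrete, below is a minimal sketch of a RegFlow-style training step, assuming a PyTorch flow that exposes a forward map and an inverse. The interface (`flow`, `flow.inverse`), the minibatch OT pairing via SciPy's Hungarian solver, and the weight `lam` are illustrative assumptions based on the abstract's description, not the authors' implementation.

```python
import torch
from scipy.optimize import linear_sum_assignment


def ot_pairing(x0, x1):
    """Pair prior samples x0 with target samples x1 through a minibatch
    optimal-transport coupling (exact assignment on squared Euclidean cost)."""
    cost = torch.cdist(x0, x1).pow(2)
    rows, cols = linear_sum_assignment(cost.detach().cpu().numpy())
    return x0[rows], x1[cols]


def regflow_loss(flow, x0, x1, lam=0.1):
    """l2 regression of flow outputs onto OT-coupled targets, plus a
    forward-backward self-consistency penalty. The penalty is only
    informative when the inverse is approximate or numerically delicate;
    for an exactly invertible flow it is identically zero."""
    z, x = ot_pairing(x0, x1)
    x_hat = flow(z)                        # forward pass: prior -> data
    regression = (x_hat - x).pow(2).sum(-1).mean()
    z_rec = flow.inverse(x_hat)            # backward pass: data -> prior
    consistency = (z_rec - z).pow(2).sum(-1).mean()
    return regression + lam * consistency
```

Per the abstract, the regression targets could equally come from integrating a pre-trained CNF starting at each prior sample rather than from OT pairing; in this sketch only the construction of x would change.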
Authors (8)
Danyal Rehman, Oscar Davis, Jiarui Lu, Jian Tang, Michael Bronstein, Yoshua Bengio, +2 more