Latent Discrete Diffusion Models

Abstract

We study discrete diffusion for language and other categorical data and focus on a common limitation of masked denoisers: reverse transitions typically factorize across positions, which can weaken joint structure and degrade quality in few-step generation. We propose Latent Discrete Diffusion Models (LDDMs), which couple a masked discrete diffusion over tokens with a continuous diffusion over latent embeddings. The latent channel provides a softer signal and carries cross-token dependencies that help resolve ambiguities. We present two instantiations: (i) FUJI-LDDMs, which perform fully joint denoising of tokens and latents, and (ii) SEQ-LDDMs, which sequentially resolve the latent chain and then the discrete chain conditionally on it. For both variants we derive ELBO-style objectives and discuss design choices for learning latents that are informative yet amenable to diffusion modeling. In experiments, LDDMs improve on unconditional generation metrics over state-of-the-art masked discrete diffusion baselines, and are effective at lower sampling budgets, where unmasking many tokens per step is desirable.
Authors (3)
Dario Shariatian
Alain Durmus
Stefano Peluchetti
Submitted
October 20, 2025
arXiv Category
cs.LG
arXiv PDF

Key Contributions

Latent Discrete Diffusion Models (LDDMs) address a limitation of discrete diffusion by coupling a masked discrete diffusion over tokens with a continuous diffusion over latent embeddings. The latent channel carries cross-token dependencies, improving joint structure and few-step generation quality for categorical data such as language, and outperforms state-of-the-art masked discrete diffusion baselines on unconditional generation metrics.
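To make the coupling concrete, here is a minimal toy sketch of the two forward (noising) processes being run side by side: an absorbing-mask corruption of token ids and a Gaussian corruption of per-token latent embeddings, noised to a shared level as in the fully joint (FUJI-style) variant. All function names, the `MASK` sentinel, and the noise schedule are illustrative assumptions, not the paper's actual code or parameterization.

```python
import numpy as np

# Illustrative sketch of the LDDM coupling (hypothetical names, not the
# paper's implementation): a masked discrete diffusion over token ids
# paired with a continuous Gaussian diffusion over latent embeddings.

MASK = -1  # sentinel id for the absorbing mask state (an assumption)

def mask_tokens(tokens, t, rng):
    # Absorbing forward process: each position is independently
    # replaced by MASK with probability t (the noise level in [0, 1]).
    return np.where(rng.random(tokens.shape) < t, MASK, tokens)

def noise_latents(z0, t, rng):
    # Variance-preserving-style Gaussian corruption of the latents:
    # interpolate toward pure noise as t -> 1.
    return np.sqrt(1.0 - t) * z0 + np.sqrt(t) * rng.standard_normal(z0.shape)

def joint_forward(tokens, z0, t, rng):
    # FUJI-style coupling: both chains are corrupted to the same level t,
    # so a joint denoiser sees masked tokens alongside noisy latents.
    return mask_tokens(tokens, t, rng), noise_latents(z0, t, rng)

rng = np.random.default_rng(0)
tokens = rng.integers(0, 100, size=32)   # toy token sequence
z0 = rng.standard_normal((32, 16))       # toy per-token latent embeddings

xt, zt = joint_forward(tokens, z0, t=0.5, rng=rng)
```

In the SEQ-style variant, by contrast, sampling would first resolve the latent chain and then run the discrete reverse process conditioned on the recovered latents; the point of the latent channel in either case is that `zt` retains a soft, correlated signal about the sequence even when many positions of `xt` are masked.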

Business Value

Enables the generation of more coherent and higher-quality text and other discrete data, potentially leading to better AI writing assistants, creative tools, and more robust data augmentation techniques.