📄 Abstract
The capabilities of large language models (LLMs) are widely regarded as
depending on autoregressive models (ARMs). We challenge this notion by
introducing LLaDA, a diffusion model trained from scratch under the
pre-training and supervised fine-tuning (SFT) paradigm. LLaDA employs a forward
data-masking process and a reverse generation process, parameterized by a
Transformer that predicts masked tokens. By optimizing a likelihood lower
bound, it provides a principled generative approach to probabilistic inference.
Across extensive benchmarks covering general tasks, mathematics, code, and
other domains, LLaDA demonstrates strong scalability and performs comparably to
our self-constructed ARM baselines. Remarkably, LLaDA 8B is competitive with
strong LLMs such as LLaMA3 8B in in-context learning and, after SFT, exhibits
impressive instruction-following abilities in case studies such as multi-turn
dialogue. Moreover, LLaDA addresses the reversal curse, surpassing GPT-4o on a
reversal poem-completion task. Our findings show the promise of diffusion
models for language modeling at scale and challenge the common assumption that
the core LLM capabilities discussed above inherently depend on ARMs. Project
page and code: https://ml-gsai.github.io/LLaDA-demo/.
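The abstract only names the training recipe (a forward masking process, a Transformer mask predictor, and a likelihood lower bound). As a rough illustration, the PyTorch sketch below shows one way such a masked-diffusion training loss can be written. It is a minimal sketch under generic assumptions: the names (masked_diffusion_loss, mask_id, model) are hypothetical and the per-token normalization is a common convention, not the paper's exact implementation.

```python
# Hypothetical sketch of a masked-diffusion training step as described in the
# abstract: sample a masking ratio t, mask tokens independently with
# probability t, and train a Transformer mask predictor on the masked
# positions. Names and shapes are illustrative, not from the LLaDA codebase.
import torch
import torch.nn.functional as F

def masked_diffusion_loss(model, x0, mask_id):
    """Monte Carlo estimate of a likelihood-bound loss for one batch.

    model: maps token ids (B, L) to logits (B, L, V).
    x0: clean token ids, shape (B, L).
    mask_id: id of the special [MASK] token.
    """
    B, L = x0.shape
    # Forward process: draw a masking ratio t ~ U(0, 1] per sequence and
    # mask each token independently with probability t.
    t = torch.rand(B, 1, device=x0.device).clamp(min=1e-3)
    masked = torch.rand(B, L, device=x0.device) < t
    xt = torch.where(masked, torch.full_like(x0, mask_id), x0)

    # Reverse-process predictor: predict the original tokens at every position.
    logits = model(xt)                                            # (B, L, V)
    token_loss = F.cross_entropy(
        logits.reshape(B * L, -1), x0.reshape(B * L), reduction="none"
    ).reshape(B, L)

    # Likelihood lower bound: cross-entropy only on masked positions,
    # reweighted by 1/t and averaged per token (cf. the bound optimized
    # in the paper).
    per_seq = (token_loss * masked).sum(dim=1) / t.squeeze(1) / L
    return per_seq.mean()
```

At inference time, the reverse generation process would start from a fully masked sequence and iteratively replace masked positions with tokens predicted by the same model, rather than decoding left to right.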
Authors (10)
Shen Nie
Fengqi Zhu
Zebin You
Xiaolu Zhang
Jingyang Ou
Jun Hu
+4 more
Submitted
February 14, 2025
Key Contributions
Introduces LLaDA, a diffusion model trained from scratch for language generation, challenging the dominance of autoregressive models. LLaDA demonstrates strong scalability and performance competitive with ARM baselines across a range of tasks, exhibits impressive instruction-following abilities after SFT, and addresses the 'reversal curse'.
Business Value
Offers a new paradigm for building powerful LLMs, potentially leading to more efficient training, novel generation capabilities, and improved performance in specific tasks like code generation or complex instruction following.