Abstract
Scaling laws research has focused overwhelmingly on English, yet the most prominent AI models explicitly serve billions of international users. In this work, we undertake the largest multilingual scaling laws study to date, totaling 774 multilingual training experiments spanning 10M-8B model parameters, 400+ training languages, and 48 evaluation languages. We introduce the Adaptive Transfer Scaling Law (ATLAS) for both monolingual and multilingual pretraining, which improves out-of-sample generalization over existing scaling laws, often by more than 0.3 R^2. Our analyses of the experiments shed light on multilingual learning dynamics, transfer properties between languages, and the curse of multilinguality. First, we derive a cross-lingual transfer matrix, empirically measuring mutual benefit scores across 38 x 38 = 1,444 language pairs. Second, we derive a language-agnostic scaling law that reveals how to optimally scale model size and data when adding languages without sacrificing performance. Third, we identify the computational crossover points for when to pretrain from scratch versus finetune from multilingual checkpoints. We hope these findings provide the scientific foundation for democratizing scaling laws across languages and enable practitioners to scale models efficiently beyond English-first AI.
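The abstract does not reproduce ATLAS's functional form. Purely as a point of reference, the following is a minimal sketch, assuming the common Chinchilla-style parametric form L(N, D) = E + A/N^alpha + B/D^beta, of how a loss scaling law of this kind is fit to (model size, token count, loss) observations with least squares. The data, constants, and initial guesses below are hypothetical and are not taken from the paper.

```python
# Minimal sketch: fitting a generic Chinchilla-style loss scaling law
# L(N, D) = E + A/N**alpha + B/D**beta to synthetic (params, tokens, loss)
# triples. This is NOT the paper's ATLAS formulation; it only illustrates
# the kind of baseline such laws are compared against.
import numpy as np
from scipy.optimize import curve_fit

def scaling_law(x, E, A, B, alpha, beta):
    N, D = x  # model parameters, training tokens
    return E + A / N**alpha + B / D**beta

# Synthetic grid: 4 model sizes (10M to ~8B params) x 3 token budgets.
N = np.repeat(np.logspace(7, np.log10(8e9), 4), 3)
D = np.tile(np.logspace(9, 12, 3), 4)
rng = np.random.default_rng(0)
loss = scaling_law((N, D), 1.7, 400.0, 1500.0, 0.34, 0.28)
loss = loss + rng.normal(0.0, 0.01, size=loss.shape)  # observation noise

p0 = [2.0, 300.0, 1000.0, 0.3, 0.3]  # rough initial guess for the optimizer
popt, _ = curve_fit(scaling_law, (N, D), loss, p0=p0, maxfev=20000)
E, A, B, alpha, beta = popt
print(f"E={E:.2f} A={A:.1f} B={B:.1f} alpha={alpha:.3f} beta={beta:.3f}")
```

In practice, fitted coefficients like these are what a multilingual scaling study would extrapolate to unseen model sizes or languages, and the paper's reported gains of more than 0.3 R^2 refer to ATLAS's out-of-sample fit relative to existing laws of this general family.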
Authors (9)
Shayne Longpre
Sneha Kudugunta
Niklas Muennighoff
I-Hung Hsu
Isaac Caswell
Alex Pentland
+3 more
Submitted
October 24, 2025
Key Contributions
This paper presents the largest multilingual scaling laws study to date (774 training experiments spanning 10M-8B parameters and 400+ training languages) and introduces the Adaptive Transfer Scaling Law (ATLAS), which improves out-of-sample generalization over existing scaling laws for both monolingual and multilingual pretraining, often by more than 0.3 R^2. It also contributes a 38 x 38 cross-lingual transfer matrix, a language-agnostic scaling law for adding languages without sacrificing performance, and compute crossover points for pretraining from scratch versus finetuning from multilingual checkpoints.
Business Value
Enables more efficient and effective development of AI models that serve a global user base, reducing training costs and improving performance for non-English languages.