📄 Abstract
Large language models (LLMs) have revolutionized natural language processing
and are increasingly applied to other sequential data types, including genetic
sequences. However, adapting LLMs to genomics presents significant challenges.
Capturing complex genomic interactions requires modeling long-range
dependencies within DNA sequences, where interactions often span over 10,000
base pairs, even within a single gene, posing substantial computational burdens
under conventional model architectures and training paradigms. Moreover,
standard LLM training approaches are suboptimal for DNA: autoregressive
training, while efficient, supports only unidirectional understanding. However,
DNA is inherently bidirectional, e.g., bidirectional promoters regulate
transcription in both directions and account for nearly 11% of human gene
expression. Masked language models (MLMs) allow bidirectional understanding but
are inefficient, as only masked tokens contribute to the loss per step. To
address these limitations, we introduce JanusDNA, the first bidirectional DNA
foundation model built upon a novel pretraining paradigm that combines the
optimization efficiency of autoregressive modeling with the bidirectional
comprehension of masked modeling. JanusDNA adopts a hybrid Mamba, Attention, and
Mixture of Experts (MoE) architecture, combining the long-range modeling
capability of Attention with the efficient sequential learning of Mamba. MoE layers further scale
model capacity via sparse activation while keeping computational cost low.
Notably, JanusDNA processes up to 1 million base pairs at single nucleotide
resolution on a single 80GB GPU. Extensive experiments and ablations show
JanusDNA achieves new SOTA results on three genomic representation benchmarks,
outperforming models with 250x more activated parameters. Code:
https://github.com/Qihao-Duan/JanusDNA
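
The hybrid architecture described in the abstract can be pictured with a short, self-contained PyTorch sketch. This is an illustrative assumption, not the released JanusDNA implementation (see the repository above for that): a bidirectional attention layer handles long-range interactions, a simple recurrent layer stands in for the Mamba state-space block purely to keep the sketch dependency-free, and a top-1 Mixture-of-Experts feed-forward layer adds capacity while activating only one expert per token.

```python
# Hypothetical sketch (assumptions, not the authors' code): stacks the three
# ingredients named in the abstract -- a sequential-mixing layer (stand-in for
# Mamba), bidirectional attention, and a sparse Mixture-of-Experts block.
import torch
import torch.nn as nn

class Top1MoE(nn.Module):
    """Sparse feed-forward: each token is routed to a single expert MLP."""
    def __init__(self, d_model: int, n_experts: int = 4):
        super().__init__()
        self.router = nn.Linear(d_model, n_experts)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, 4 * d_model), nn.GELU(),
                          nn.Linear(4 * d_model, d_model))
            for _ in range(n_experts))

    def forward(self, x):                        # x: (batch, seq, d_model)
        choice = self.router(x).argmax(dim=-1)   # top-1 expert per token
        out = torch.zeros_like(x)
        for i, expert in enumerate(self.experts):
            sel = choice == i
            if sel.any():
                out[sel] = expert(x[sel])        # only the chosen expert runs
        return out

class HybridBlock(nn.Module):
    """Sequential mixing + bidirectional attention + sparse MoE feed-forward."""
    def __init__(self, d_model: int = 128, n_heads: int = 4):
        super().__init__()
        # Stand-in for a Mamba/SSM layer; a GRU also mixes information along
        # the sequence and keeps this sketch free of extra dependencies.
        self.seq_mixer = nn.GRU(d_model, d_model, batch_first=True)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.moe = Top1MoE(d_model)
        self.norm1, self.norm2, self.norm3 = (nn.LayerNorm(d_model) for _ in range(3))

    def forward(self, x):                        # x: (batch, seq, d_model)
        x = x + self.seq_mixer(self.norm1(x))[0]
        h = self.norm2(x)
        attn_out, _ = self.attn(h, h, h)         # no causal mask: both directions
        x = x + attn_out
        return x + self.moe(self.norm3(x))

block = HybridBlock()
nucleotide_embeddings = torch.randn(2, 1024, 128)  # (batch, positions, d_model)
print(block(nucleotide_embeddings).shape)           # torch.Size([2, 1024, 128])
```

A production MoE would typically add a load-balancing term to the loss and the sequential layer would be an actual Mamba module; this sketch omits both for brevity and only illustrates the layer composition.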
Authors (7)
Qihao Duan
Bingding Huang
Zhenqiao Song
Irina Lehmann
Lei Gu
Roland Eils
+1 more
Key Contributions
This paper introduces JanusDNA, a bidirectional hybrid DNA foundation model designed to overcome the challenges of applying LLMs to genomics. It addresses the need to model long-range dependencies and the inherent bidirectionality of DNA sequences, proposing a hybrid pretraining approach that combines the strengths of autoregressive and masked language modeling for more effective and efficient genomic analysis.
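
To make the trade-off behind this hybrid approach concrete, the toy PyTorch sketch below (illustrative shapes and random stand-in logits, not the JanusDNA training objective) contrasts the two standard losses: autoregressive training supervises every position but only with left-to-right context, while masked language modeling sees both directions yet collects a learning signal from only the roughly 15% of positions that are masked in a given step.

```python
# Toy illustration (assumed shapes, random stand-in model outputs): how much of
# a sequence is supervised under the two standard pretraining objectives.
import torch
import torch.nn.functional as F

vocab_size, seq_len = 6, 16                       # e.g. A, C, G, T, N, [MASK]
tokens = torch.randint(0, 4, (1, seq_len))        # toy nucleotide sequence
logits = torch.randn(1, seq_len, vocab_size)      # stand-in for model outputs

# Autoregressive: predict position t+1 from positions <= t.
# Every position is supervised, but only left-to-right context is available.
ar_loss = F.cross_entropy(logits[:, :-1].reshape(-1, vocab_size),
                          tokens[:, 1:].reshape(-1))

# Masked LM: hide ~15% of positions and predict only those.
# The model sees both directions, but most positions yield no loss this step.
num_masked = max(1, int(0.15 * seq_len))
mask = torch.zeros(1, seq_len, dtype=torch.bool)
mask[0, torch.randperm(seq_len)[:num_masked]] = True
mlm_loss = F.cross_entropy(logits[mask], tokens[mask])

print(f"AR:  {seq_len - 1} of {seq_len} positions supervised, loss={ar_loss.item():.3f}")
print(f"MLM: {num_masked} of {seq_len} positions supervised, loss={mlm_loss.item():.3f}")
```

JanusDNA's stated goal is to obtain both properties at once: per-token supervision comparable to the autoregressive case together with bidirectional context comparable to the masked case.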
Business Value
Accelerates genomic research and drug discovery by providing a more powerful and efficient tool for analyzing DNA sequences, enabling a better understanding of genetic diseases and the development of targeted therapies.