
Differential Mamba

Abstract

Sequence models like Transformers and RNNs often overallocate attention to irrelevant context, leading to noisy intermediate representations. This degrades LLM capabilities by promoting hallucinations, weakening long-range and retrieval abilities, and reducing robustness. Recent work has shown that differential design can mitigate this issue in Transformers, improving their effectiveness across various applications. In this paper, we explore whether these techniques, originally developed for Transformers, can be applied to Mamba, a recent architecture based on selective state-space layers that achieves Transformer-level performance with greater efficiency. We show that a naive adaptation of differential design to Mamba is insufficient and requires careful architectural modifications. To address this, we introduce a novel differential mechanism for Mamba, empirically validated on language modeling benchmarks, demonstrating improved retrieval capabilities and superior performance over vanilla Mamba. Finally, we conduct extensive ablation studies and empirical analyses to justify our design choices and provide evidence that our approach effectively mitigates the overallocation problem in Mamba-based models. Our code is publicly available: https://github.com/NadavSc/Diff-Mamba
Authors (3)
Nadav Schneider
Itamar Zimerman
Eliya Nachmani
Submitted
July 8, 2025
arXiv Category
cs.LG

Key Contributions

This paper explores adapting 'differential design' techniques, originally developed for Transformers, to the Mamba architecture. It shows that a naive adaptation is insufficient and introduces a novel differential mechanism for Mamba, empirically validated on language modeling benchmarks, where it improves retrieval capabilities and outperforms vanilla Mamba. The general idea behind differential design is sketched below.
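The summary does not spell out the paper's mechanism, but the underlying "differential design" idea (from the Differential Transformer line of work) is to run two parallel sequence-mixing branches and output their weighted difference, so noise common to both branches cancels. Below is a minimal, hypothetical PyTorch sketch of that general idea only; `SimpleSSM` (a depthwise causal convolution standing in for a selective state-space layer), `DifferentialBlock`, and the learnable weight `lam` are illustrative assumptions, not the paper's actual Diff-Mamba implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleSSM(nn.Module):
    """Hypothetical stand-in for a Mamba-style sequence mixer:
    a depthwise causal convolution over the sequence dimension."""
    def __init__(self, d_model: int, kernel_size: int = 4):
        super().__init__()
        self.pad = kernel_size - 1  # left-pad so the conv is causal
        self.conv = nn.Conv1d(d_model, d_model, kernel_size, groups=d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model) -> conv expects (batch, d_model, seq_len)
        y = self.conv(F.pad(x.transpose(1, 2), (self.pad, 0)))
        return y.transpose(1, 2)

class DifferentialBlock(nn.Module):
    """Runs two parallel mixers and outputs their weighted difference,
    so noise shared by both branches cancels (the differential idea)."""
    def __init__(self, d_model: int):
        super().__init__()
        self.branch_a = SimpleSSM(d_model)
        self.branch_b = SimpleSSM(d_model)
        self.lam = nn.Parameter(torch.tensor(0.5))  # learnable mixing weight
        self.norm = nn.LayerNorm(d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        diff = self.branch_a(x) - self.lam * self.branch_b(x)
        return self.norm(diff)

x = torch.randn(2, 16, 64)             # (batch, seq_len, d_model)
print(DifferentialBlock(64)(x).shape)  # torch.Size([2, 16, 64])
```

For comparison, the Differential Transformer realizes this by subtracting two softmax attention maps with a learnable λ; the abstract's point is that porting the idea to Mamba requires more careful architectural modifications than such a direct subtraction.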

Business Value

Developing more efficient and capable sequence models like Mamba can lead to faster and more accurate AI applications, particularly in areas requiring long-context understanding and information retrieval.