From Characters to Tokens: Dynamic Grouping with Hierarchical BPE

Abstract

Subword tokenization methods like Byte Pair Encoding (BPE) are widely used in large language models due to their balance of vocabulary compactness and representational power. However, they suffer from inefficiencies in representing rare words and require large embedding matrices. Character-level models address these issues but introduce performance bottlenecks, particularly in Transformer-based architectures. Recent hierarchical models attempt to merge the benefits of both paradigms by grouping characters into patches, but existing patching strategies either rely on whitespace, limiting applicability to certain languages, or require auxiliary models that introduce new dependencies. In this paper, we propose a dynamic character grouping method that leverages the structure of existing BPE tokenization without requiring additional models. By appending explicit end-of-patch markers to BPE tokens and introducing a second-level BPE compression stage to control patch granularity, our method offers efficient, flexible, and language-agnostic representations. Empirical results demonstrate that our approach matches or exceeds the performance of dynamic entropy- and whitespace-based patching strategies, while maintaining a compact vocabulary.
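
The first ingredient is easy to sketch. The snippet below is a toy illustration, not the authors' code: it expands each first-level BPE token into its characters and appends an explicit end-of-patch marker, so patch boundaries follow the existing BPE segmentation rather than whitespace. The `<eop>` symbol, the helper name, and the example tokens are illustrative assumptions.

```python
# Toy sketch (not the authors' implementation): derive character patches from an
# existing BPE segmentation by appending an explicit end-of-patch marker to each
# token's character sequence. The marker symbol and the example tokens are assumptions.

EOP = "<eop>"  # hypothetical end-of-patch marker

def bpe_tokens_to_char_patches(bpe_tokens):
    """Expand each BPE token into characters and close the patch with the marker."""
    return [list(token) + [EOP] for token in bpe_tokens]

# Example: a word segmented by some first-level BPE tokenizer.
tokens = ["un", "believ", "able"]
print(bpe_tokens_to_char_patches(tokens))
# [['u', 'n', '<eop>'], ['b', 'e', 'l', 'i', 'e', 'v', '<eop>'], ['a', 'b', 'l', 'e', '<eop>']]
```
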
Authors (4)
Rares Dolga
Lucas Maystre
Tudor Berariu
David Barber
Submitted
October 17, 2025
arXiv Category
cs.CL
arXiv PDF

Key Contributions

This paper proposes a novel dynamic character grouping method that enhances Hierarchical BPE by leveraging existing BPE structures without auxiliary models. It introduces end-of-patch markers and a second-level BPE compression stage to control granularity, aiming to combine the benefits of subword and character-level tokenization.
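
The second ingredient, controlling patch granularity with a second-level BPE compression stage, can be sketched in the same spirit. The code below learns a few greedy merges over sequences of first-level BPE tokens so that adjacent tokens fuse into coarser patches; the `num_merges` knob and the training loop are assumptions about how such a stage could work, not the paper's exact algorithm.

```python
# Hedged sketch of a second-level BPE stage: treat first-level BPE tokens as symbols
# and learn merges over them, so adjacent tokens fuse into coarser patches. The
# `num_merges` knob and the greedy pair-counting loop are illustrative assumptions.
from collections import Counter

def learn_second_level_merges(token_sequences, num_merges):
    """Greedily learn `num_merges` pairwise merges over first-level BPE token sequences."""
    seqs = [list(seq) for seq in token_sequences]
    merges = []
    for _ in range(num_merges):
        # Count adjacent token pairs across the corpus.
        pair_counts = Counter()
        for seq in seqs:
            pair_counts.update(zip(seq, seq[1:]))
        if not pair_counts:
            break
        (a, b), _ = pair_counts.most_common(1)[0]
        merges.append((a, b))
        # Apply the merge: fuse every adjacent occurrence of (a, b) into one symbol.
        merged_seqs = []
        for seq in seqs:
            out, i = [], 0
            while i < len(seq):
                if i + 1 < len(seq) and (seq[i], seq[i + 1]) == (a, b):
                    out.append(a + b)
                    i += 2
                else:
                    out.append(seq[i])
                    i += 1
            merged_seqs.append(out)
        seqs = merged_seqs
    return merges, seqs

# Toy corpus of first-level BPE token sequences; more merges -> coarser patches.
corpus = [["un", "believ", "able"], ["un", "do"], ["believ", "er"]]
merges, patches = learn_second_level_merges(corpus, num_merges=2)
print(merges)   # learned merge rules, e.g. [('un', 'believ'), ('unbeliev', 'able')]
print(patches)  # coarser patch symbols after applying the merges
```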

Business Value

More efficient and effective tokenization can lead to smaller, faster, and more capable language models, reducing computational costs and improving performance in various NLP applications.