
Merging Continual Pretraining Models for Domain-Specialized LLMs: A Case Study in Finance

Abstract

While LLMs excel at general tasks, they struggle in specialized domains like finance, requiring diverse skills in domain knowledge, mathematical reasoning, and multilingual processing. Merging domain-specific Continual Pre-training (CPT) "experts" offers a practical alternative to costly and unstable multi-skill training. However, unlike established Supervised Fine-Tuning (SFT) model-based merging, CPT model merging remains largely unexplored. We address this gap by creating financial LLMs from experts in finance, math, and Japanese. We propose a three-stage evaluation focusing on knowledge recovery, complementarity, and emergence, and assess three merging methods (Task Arithmetic, TIES, and DARE-TIES) on a comprehensive financial benchmark curated from 18 tasks across 8 established datasets. Results show that merging an expert with its base model recovers general knowledge lost during CPT, while merging experts improves performance and can yield emergent cross-domain skills. Among the methods, Task Arithmetic performs strongly but is hyperparameter-sensitive, whereas TIES is more robust. Our findings also suggest that while model similarity correlates with merging success, emergent skills depend on more complex factors. This work presents the first foundational analysis of CPT model merging, establishing a principled framework and providing clear guidance for building multi-skill LLMs from existing assets.
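Task Arithmetic, the simplest of the three methods, treats each CPT expert as a "task vector": the parameter delta between the expert and the shared base model, which is scaled and added back onto the base weights. The sketch below illustrates the idea under the assumption that every expert was continually pretrained from the same base checkpoint; the model paths, the function name, and the `alpha` value are illustrative placeholders, not details from the paper.

```python
# Minimal sketch of Task Arithmetic merging of CPT experts.
# Assumes all checkpoints share one architecture and parameter names;
# paths and alpha are hypothetical, not the paper's settings.
import torch
from transformers import AutoModelForCausalLM

def task_arithmetic_merge(base_path, expert_paths, alpha=0.4):
    base = AutoModelForCausalLM.from_pretrained(base_path)
    base_sd = base.state_dict()
    merged_sd = {k: v.clone() for k, v in base_sd.items()}

    for path in expert_paths:
        expert_sd = AutoModelForCausalLM.from_pretrained(path).state_dict()
        for k, v in merged_sd.items():
            if torch.is_floating_point(v):
                # Task vector: the parameter delta that CPT introduced
                # in this expert relative to the shared base model.
                v += alpha * (expert_sd[k] - base_sd[k])

    base.load_state_dict(merged_sd)
    return base

# e.g. merging finance, math, and Japanese CPT experts (hypothetical names):
# merged = task_arithmetic_merge(
#     "org/base-llm",
#     ["org/base-llm-finance", "org/base-llm-math", "org/base-llm-ja"],
# )
```

Because the base weights anchor the merge, the same operation with a single expert and alpha < 1 also illustrates the paper's "knowledge recovery" finding: interpolating an expert back toward its base restores general knowledge lost during CPT.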

Key Contributions

This paper pioneers the exploration of merging Continual Pre-training (CPT) models for creating domain-specialized LLMs, specifically in finance. It proposes a novel three-stage evaluation framework and assesses existing merging techniques (Task Arithmetic, TIES, DARE-TIES), demonstrating that merging CPT experts can recover lost general knowledge and yield emergent cross-domain skills, offering a practical alternative to multi-skill training.
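TIES, which the paper finds more robust to hyperparameters than Task Arithmetic, refines the merge in three steps: trim low-magnitude entries of each task vector, elect a per-parameter sign, and average only the deltas that agree with the elected sign. The sketch below is a simplified rendering of that procedure operating on precomputed task vectors, not the paper's implementation; the `density` value and per-tensor trimming are assumptions.

```python
# Simplified TIES-style merge of task vectors (expert minus base deltas).
# Trim -> elect sign -> disjoint mean. The merged delta would then be
# scaled and added back onto the base weights, as in Task Arithmetic.
import torch

def ties_merge_deltas(deltas, density=0.2):
    trimmed = []
    for d in deltas:
        # Trim: keep only the top density-fraction of entries by magnitude.
        k = max(1, int(density * d.numel()))
        thresh = d.abs().flatten().kthvalue(d.numel() - k + 1).values
        trimmed.append(torch.where(d.abs() >= thresh, d, torch.zeros_like(d)))
    stacked = torch.stack(trimmed)

    # Elect: the dominant sign per parameter, by summed signed mass.
    elected = torch.sign(stacked.sum(dim=0))

    # Disjoint mean: average only nonzero entries matching the elected sign.
    agree = (torch.sign(stacked) == elected) & (stacked != 0)
    counts = agree.sum(dim=0).clamp(min=1)
    return (stacked * agree).sum(dim=0) / counts
```

Dropping conflicting-sign entries is what makes TIES less sensitive to the merge scaling than plain Task Arithmetic, since opposing expert updates no longer partially cancel into noisy averages.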

Business Value

Enables the creation of highly capable, domain-specific LLMs for finance more efficiently, reducing training costs and improving performance on specialized tasks.