This paper proposes Mixture of Routers (MoR), a fine-tuning method inspired by Redundancy and Fault Tolerance Theory that integrates Mixture-of-Experts (MoE) principles into the routing mechanism itself. MoR aims to address incorrect token-to-expert assignments and imbalanced expert allocation in MoE, thereby improving the efficiency and performance of parameter-efficient fine-tuning methods such as LoRA.
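The summary does not spell out the paper's exact architecture, so the following is only a minimal sketch of the general "mixture of routers" idea it describes: several redundant sub-routers each score the LoRA experts, and a main router decides per token how much to trust each sub-router. All names here (`MoRLayer`, `SubRouter` count, `num_sub_routers`, dense gating instead of top-k, etc.) are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class LoRAExpert(nn.Module):
    """One low-rank adapter (LoRA) expert: up-projection after down-projection."""
    def __init__(self, d_model: int, r: int = 8):
        super().__init__()
        self.down = nn.Linear(d_model, r, bias=False)
        self.up = nn.Linear(r, d_model, bias=False)
        nn.init.zeros_(self.up.weight)  # standard LoRA init: adapter starts as a zero update

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.up(self.down(x))


class MoRLayer(nn.Module):
    """Hypothetical mixture-of-routers over LoRA experts (illustrative only).

    Several sub-routers independently produce expert distributions; a main
    router produces per-token weights over the sub-routers, and the final
    expert distribution is their weighted combination. Redundant routers
    provide fault tolerance against a single router's misassignments.
    """
    def __init__(self, d_model: int, num_experts: int = 4,
                 num_sub_routers: int = 3, r: int = 8):
        super().__init__()
        self.experts = nn.ModuleList(LoRAExpert(d_model, r) for _ in range(num_experts))
        self.sub_routers = nn.ModuleList(
            nn.Linear(d_model, num_experts) for _ in range(num_sub_routers))
        self.main_router = nn.Linear(d_model, num_sub_routers)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, d_model)
        # Per-token weights over sub-routers: (batch, seq, num_sub_routers)
        router_weights = F.softmax(self.main_router(x), dim=-1)
        # Each sub-router's expert distribution: (batch, seq, num_sub_routers, num_experts)
        expert_probs = torch.stack(
            [F.softmax(r(x), dim=-1) for r in self.sub_routers], dim=-2)
        # Combine sub-router opinions into one expert distribution per token.
        combined = (router_weights.unsqueeze(-1) * expert_probs).sum(dim=-2)
        # Dense mixture of LoRA expert outputs (a real system might use top-k gating).
        expert_out = torch.stack([e(x) for e in self.experts], dim=-2)
        return (combined.unsqueeze(-1) * expert_out).sum(dim=-2)


if __name__ == "__main__":
    layer = MoRLayer(d_model=64)
    h = torch.randn(2, 10, 64)
    print(layer(h).shape)  # torch.Size([2, 10, 64])
```

In this sketch the adapter update would be added to a frozen base layer's output, as in standard LoRA; the mixture only governs which low-rank experts contribute for each token.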
MoR enables more efficient and effective fine-tuning of large models, reducing training cost and time, and can yield better-performing specialized models for a range of downstream tasks.