📄 Abstract
Tool calling has become increasingly popular for Large Language Models (LLMs). However, for large tool sets, the combined tool descriptions can exceed the LLM's context window limit, making it impossible to include every tool. Hence, an external retriever is used to provide the LLM with the most relevant tools for a query. Existing retrieval models rank tools based on the similarity between a user query and a tool description (TD). This leads to suboptimal retrieval, as user requests are often poorly aligned with the language of TDs. To remedy this issue, we propose ToolDreamer, a framework that conditions retriever models to fetch tools based on hypothetical (synthetic) TDs generated by an LLM, i.e., descriptions of tools that the LLM believes would be useful for the query. The framework enables a more natural alignment between queries and tools within the language space of TDs. We apply ToolDreamer to the ToolRet dataset and show that our method improves the performance of sparse and dense retrievers both with and without training, showcasing its flexibility. Through our proposed framework, we aim to offload a portion of the reasoning burden to the retriever so that the LLM can effectively handle a large collection of tools without inundating its context window.
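The retrieval idea above can be sketched as follows. This is a minimal illustration, not the authors' implementation: `dream_tool_description` is a hypothetical stand-in for the LLM call that generates a synthetic TD, and the bag-of-words cosine similarity stands in for a real sparse or dense retriever. The key point is that ranking is done against the hypothetical TD rather than the raw query.

```python
from collections import Counter
from math import sqrt

def embed(text):
    # Toy bag-of-words "embedding"; a real system would use a
    # sparse (e.g., BM25) or dense retriever instead.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse token-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def dream_tool_description(query):
    # Hypothetical placeholder for an LLM call that writes a synthetic
    # tool description (the "dreamed" TD) for the query.
    return f"tool that can {query} using an external api"

def retrieve(query, tool_descriptions, top_k=1):
    # Rank real TDs against the hypothetical TD, not the raw query,
    # so matching happens within the language space of TDs.
    hyp = embed(dream_tool_description(query))
    ranked = sorted(tool_descriptions,
                    key=lambda td: cosine(hyp, embed(td)),
                    reverse=True)
    return ranked[:top_k]

tools = [
    "tool that can convert currency using an external exchange-rate api",
    "tool that draws charts from tabular data",
]
print(retrieve("convert currency", tools))
```

Because the synthetic TD is phrased like a tool description, its token overlap with the matching real TD is much higher than the raw query's would be, which is the alignment effect the framework exploits.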
Authors (7)
Saptarshi Sengupta
Zhengyu Zhou
Jun Araki
Xingbo Wang
Bingqing Wang
Suhang Wang
Submitted
October 22, 2025
Key Contributions
Proposes ToolDreamer, a framework that conditions retriever models to fetch tools based on hypothetical, LLM-generated descriptions. This improves the alignment between user queries and tools, leading to better retrieval performance compared to traditional methods relying solely on static tool descriptions.
Business Value
Enhances the capabilities of AI agents and LLMs by enabling them to more effectively discover and utilize external tools, leading to more powerful and versatile AI applications.