
How to pass run time values to a tool

Prerequisites

This guide assumes familiarity with chat models and tool calling.

Supported models

This how-to guide uses models with native tool calling capability. You can find a list of all models that support tool calling.

You may need to bind values to a tool that are only known at runtime. For example, the tool logic may require using the ID of the user who made the request.

Most of the time, such values should not be controlled by the LLM. In fact, allowing the LLM to control the user ID may lead to a security risk.

Instead, the LLM should only control the parameters of the tool that are meant to be controlled by the LLM, while other parameters (such as user ID) should be fixed by the application logic.

This how-to guide shows a design pattern that creates the tools dynamically at run time and binds the appropriate values to them.
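Before getting into LangChain specifics, note that the underlying mechanism is an ordinary closure: a factory function captures the request-scoped value, and the functions it returns use that value without ever exposing it as a parameter. Here is a minimal plain-TypeScript sketch of the pattern (the names `store`, `makeUpdatePets`, and `updateForAlice` are illustrative, not LangChain APIs):

```typescript
// The runtime value (userId) is captured in a closure by a factory
// function; the returned function only exposes the caller-controlled
// parameter (pets).
const store: Record<string, string[]> = {};

function makeUpdatePets(userId: string) {
  // userId is fixed at creation time and never appears in the
  // returned function's signature.
  return (pets: string[]): string => {
    store[userId] = pets;
    return "updated";
  };
}

const updateForAlice = makeUpdatePets("alice");
updateForAlice(["cat", "dog"]);
console.log(store); // { alice: [ 'cat', 'dog' ] }
```

The LangChain version below follows the same shape: a factory takes the user ID and returns tools whose schemas only describe the LLM-controlled parameters.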

First, set up the chat model that the tools will be bound to:

Pick your chat model:

Install dependencies

yarn add @langchain/openai 

Add environment variables

OPENAI_API_KEY=your-api-key

Instantiate the model

import { ChatOpenAI } from "@langchain/openai";

const llm = new ChatOpenAI({
  model: "gpt-3.5-turbo",
  temperature: 0,
});

Passing request time information

The idea is to create the tools dynamically at request time and bind the appropriate information to them. For example, this information may be the user ID as resolved from the request itself.

import { z } from "zod";
import { tool } from "@langchain/core/tools";

const userToPets: Record<string, string[]> = {};

function generateToolsForUser(userId: string) {
  const updateFavoritePets = tool(
    async (input) => {
      userToPets[userId] = input.pets;
      return "update_favorite_pets called.";
    },
    {
      name: "update_favorite_pets",
      description: "Add to the list of favorite pets.",
      schema: z.object({
        pets: z.array(z.string()),
      }),
    }
  );

  const deleteFavoritePets = tool(
    async () => {
      if (userId in userToPets) {
        delete userToPets[userId];
      }
      return "delete_favorite_pets called.";
    },
    {
      name: "delete_favorite_pets",
      description: "Delete the list of favorite pets.",
      schema: z.object({}),
    }
  );

  const listFavoritePets = tool(
    async () => {
      return JSON.stringify(userToPets[userId] ?? []);
    },
    {
      name: "list_favorite_pets",
      description: "List favorite pets if any.",
      schema: z.object({}),
    }
  );

  return [updateFavoritePets, deleteFavoritePets, listFavoritePets];
}

Verify that the tools work correctly

const [updatePets, deletePets, listPets] = generateToolsForUser("brace");

await updatePets.invoke({ pets: ["cat", "dog"] });

console.log(userToPets);
console.log(await listPets.invoke({}));
{ brace: [ 'cat', 'dog' ] }
["cat","dog"]
import { BaseChatModel } from "@langchain/core/language_models/chat_models";

async function handleRunTimeRequest(
  userId: string,
  query: string,
  llm: BaseChatModel
): Promise<any> {
  if (!llm.bindTools) {
    throw new Error("Language model does not support tools.");
  }
  const tools = generateToolsForUser(userId);
  const llmWithTools = llm.bindTools(tools);
  return llmWithTools.invoke(query);
}

This code allows the LLM to invoke the tools, but the LLM is unaware that a user ID even exists! You can see that userId is not among the parameters the LLM generates:

const aiMessage = await handleRunTimeRequest(
  "brace",
  "my favorite animals are cats and parrots.",
  llm
);
console.log(aiMessage.tool_calls[0]);
{
  name: 'update_favorite_pets',
  args: { pets: [ 'cats', 'parrots' ] },
  type: 'tool_call',
  id: 'call_97h0nQ3B3cr0m58HOwq9ZyUz'
}

tip

Chat models only output requests to invoke tools. They don't actually invoke the underlying tools.

To see how to invoke the tools, please refer to how to use a model to call tools.
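The linked guide covers invoking tools through LangChain, but the dispatch step itself can be sketched in plain TypeScript: match each tool call's name against the generated tools and invoke the match with the model-supplied arguments. (This is an illustration only; the `handlers` map and its return strings are hypothetical, and real LangChain tools expose an `invoke` method instead of bare functions.)

```typescript
// Illustrative dispatcher: maps tool-call names to handler functions
// and invokes each with the arguments the model produced.
type ToolCall = { name: string; args: Record<string, unknown> };

const handlers: Record<string, (args: any) => string> = {
  update_favorite_pets: (args) => `updated: ${JSON.stringify(args.pets)}`,
  list_favorite_pets: () => "[]",
};

function dispatch(calls: ToolCall[]): string[] {
  return calls.map((call) => {
    const handler = handlers[call.name];
    if (!handler) throw new Error(`Unknown tool: ${call.name}`);
    return handler(call.args);
  });
}

const results = dispatch([
  { name: "update_favorite_pets", args: { pets: ["cats", "parrots"] } },
]);
console.log(results); // [ 'updated: ["cats","parrots"]' ]
```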

