Tool Calling

Connect Dynamo to external tools and services using function calling

You can connect Dynamo to external tools and services using function calling (also known as tool calling). When you provide a list of available functions, Dynamo can choose to output the name and arguments of the relevant function(s), which you then execute yourself to augment the prompt with relevant external information.

Tool calling is controlled with the tool_choice and tools request parameters.
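Both parameters follow the OpenAI chat completions schema. As a rough sketch of what the raw request body looks like (the host, port, and model name here are assumptions borrowed from the examples later in this guide; adjust them to your deployment):

```python
import requests

# Hypothetical request against a running Dynamo frontend; equivalent to the
# OpenAI-client examples further below, but showing the JSON fields directly.
payload = {
    "model": "openai/gpt-oss-20b",
    "messages": [{"role": "user", "content": "What's the weather in Paris?"}],
    # Advertise the callable functions to the model...
    "tools": [{
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {"location": {"type": "string"}},
                "required": ["location"],
            },
        },
    }],
    # ...and let the model decide whether to call one of them.
    "tool_choice": "auto",
}

resp = requests.post("http://localhost:8081/v1/chat/completions", json=payload)
print(resp.json())
```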

Prerequisites

To enable this feature, set the following flag when launching the backend worker:

  • --dyn-tool-call-parser : selects the tool call parser. To see the list of available parsers, run:

```bash
# <backend> can be vllm, sglang, trtllm, etc. based on your installation
python -m dynamo.<backend> --help
```

[!NOTE] If no tool call parser is provided by the user, Dynamo will try to use default tool call parsing based on <TOOLCALL> and <|python_tag|> tool tags.

[!TIP] If your model’s default chat template doesn’t support tool calling, but the model itself does, you can specify a custom chat template per worker with python -m dynamo.<backend> --custom-jinja-template </path/to/template.jinja>.

Parser to Model Mapping

| Parser Name | Supported Models |
| --- | --- |
| hermes | Qwen/Qwen2.5-*, Qwen/QwQ-32B, NousResearch/Hermes-2-Pro-*, NousResearch/Hermes-2-Theta-*, NousResearch/Hermes-3-* |
| mistral | mistralai/Mistral-7B-Instruct-v0.3; additional Mistral function-calling models are compatible as well |
| llama3_json | meta-llama/Llama-3.1-*, meta-llama/Llama-3.2-* |
| harmony | openai/gpt-oss-* |
| nemotron_deci | nvidia/nemotron-* |
| phi4 | Phi-4-* |
| deepseek_v3 | deepseek-ai/DeepSeek-V3, deepseek-ai/DeepSeek-R1, deepseek-ai/DeepSeek-R1-0528 |
| deepseek_v3_1 | deepseek-ai/DeepSeek-V3.1 |
| pythonic | meta-llama/Llama-4-* |
| jamba | ai21labs/AI21-Jamba-*-1.5, ai21labs/AI21-Jamba-*-1.6, ai21labs/AI21-Jamba-*-1.7 |

Examples

Launch Dynamo Frontend and Backend

```bash
# launch backend worker
python -m dynamo.vllm --model openai/gpt-oss-20b --dyn-tool-call-parser harmony

# launch frontend worker
python -m dynamo.frontend
```
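Once both processes are running, you can check that the model has registered with the frontend before sending tool calling requests. A minimal sketch, assuming the frontend serves its OpenAI-compatible API on port 8081 as in the request examples below:

```python
from openai import OpenAI

# Point the client at the Dynamo frontend (port assumed; adjust to your deployment).
client = OpenAI(base_url="http://localhost:8081/v1", api_key="dummy")

# List the models the frontend currently knows about;
# openai/gpt-oss-20b should appear once the backend worker has registered.
for model in client.models.list().data:
    print(model.id)
```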

Tool Calling Request Examples

  • Example 1
```python
from openai import OpenAI
import json

client = OpenAI(base_url="http://localhost:8081/v1", api_key="dummy")

def get_weather(location: str, unit: str):
    return f"Getting the weather for {location} in {unit}..."

tool_functions = {"get_weather": get_weather}

tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather in a given location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string", "description": "City and state, e.g., 'San Francisco, CA'"},
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]}
            },
            "required": ["location", "unit"]
        }
    }
}]

response = client.chat.completions.create(
    model="openai/gpt-oss-20b",
    messages=[{"role": "user", "content": "What's the weather like in San Francisco in Celsius?"}],
    tools=tools,
    tool_choice="auto",
    max_tokens=10000
)
print(f"{response}")
tool_call = response.choices[0].message.tool_calls[0].function
print(f"Function called: {tool_call.name}")
print(f"Arguments: {tool_call.arguments}")
print(f"Result: {tool_functions[tool_call.name](**json.loads(tool_call.arguments))}")
```
  • Example 2
```python
# Use tools defined in example 1

time_tool = {
    "type": "function",
    "function": {
        "name": "get_current_time_nyc",
        "description": "Get the current time in NYC.",
        "parameters": {}
    }
}

tools.append(time_tool)

messages = [
    {"role": "user", "content": "What's the current time in New York?"}
]

response = client.chat.completions.create(
    model="openai/gpt-oss-20b",  # client.models.list().data[1].id,
    messages=messages,
    tools=tools,
    tool_choice="auto",
    max_tokens=100,
)
print(f"{response}")
tool_call = response.choices[0].message.tool_calls[0].function
print(f"Function called: {tool_call.name}")
print(f"Arguments: {tool_call.arguments}")
```
  • Example 3
```python
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_tourist_attractions",
            "description": "Get a list of top tourist attractions for a given city.",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {
                        "type": "string",
                        "description": "The name of the city to find attractions for.",
                    }
                },
                "required": ["city"],
            },
        },
    },
]

def get_messages():
    return [
        {
            "role": "user",
            "content": (
                "I'm planning a trip to Tokyo next week. what are some top tourist attractions in Tokyo? "
            ),
        },
    ]

messages = get_messages()

response = client.chat.completions.create(
    model="openai/gpt-oss-20b",
    messages=messages,
    tools=tools,
    tool_choice="auto",
    max_tokens=100,
)
print(f"{response}")
tool_call = response.choices[0].message.tool_calls[0].function
print(f"Function called: {tool_call.name}")
print(f"Arguments: {tool_call.arguments}")
```