Tool Calling
Enable models to call external functions and APIs.
Tool calling (also known as function calling) enables LLMs to interact with external systems. The model does not execute tools directly—instead, it returns a structured request indicating which tool to call and with what arguments. Your application then executes the tool and returns the result to the model for further processing.
OhMyGPT standardizes the tool calling interface across all providers, so you can use the same code regardless of the underlying model.
How tool calling works
The flow involves multiple turns between your application and the model:
- You define tools — Describe available functions using JSON Schema
- Model requests a tool — The model responds with `tool_calls` instead of text
- You execute the tool — Run the function locally and collect the result
- You return the result — Send the tool output back to the model
- Model generates final response — The model uses the tool output to answer the user
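Concretely, a tool-call turn arrives as an assistant message whose `tool_calls` field names the function and carries its arguments as a JSON string. A minimal sketch of that shape (field names follow the OpenAI chat format; the `id` and argument values are illustrative):

```python
import json

# Illustrative assistant message requesting a tool call (values are made up)
assistant_message = {
    "role": "assistant",
    "content": None,  # no text when the model opts to call a tool
    "tool_calls": [
        {
            "id": "call_abc123",
            "type": "function",
            "function": {
                "name": "search_gutenberg_books",
                # Arguments arrive as a JSON string, not a parsed object
                "arguments": '{"search_terms": ["James", "Joyce"]}',
            },
        }
    ],
}

call = assistant_message["tool_calls"][0]
args = json.loads(call["function"]["arguments"])
print(call["function"]["name"], args["search_terms"])
```

Your application dispatches on the function name, parses the arguments, and echoes the `id` back in the follow-up `tool` message so the model can match results to requests.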
Complete example
This example demonstrates a book search tool using the Project Gutenberg API.
Step 1: Set up the client and define tools
```python
import json

import requests
from openai import OpenAI

client = OpenAI(
    base_url="https://api.ohmygpt.com/v1",
    api_key="<OHMYGPT_API_KEY>",
)

# Define the tool function
def search_gutenberg_books(search_terms: list[str]) -> list[dict]:
    search_query = " ".join(search_terms)
    response = requests.get(
        "https://gutendex.com/books",
        params={"search": search_query},
    )
    return [
        {
            "id": book.get("id"),
            "title": book.get("title"),
            "authors": book.get("authors"),
        }
        for book in response.json().get("results", [])
    ]

# Define the tool specification (OpenAI format)
tools = [
    {
        "type": "function",
        "function": {
            "name": "search_gutenberg_books",
            "description": "Search for books in the Project Gutenberg library",
            "parameters": {
                "type": "object",
                "properties": {
                    "search_terms": {
                        "type": "array",
                        "items": {"type": "string"},
                        "description": "Search terms to find books",
                    }
                },
                "required": ["search_terms"],
            },
        },
    }
]

# Map tool names to their implementations
TOOL_MAPPING = {
    "search_gutenberg_books": search_gutenberg_books,
}
```

Step 2: Send the initial request
```python
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What are some books by James Joyce?"},
]

response = client.chat.completions.create(
    model="gpt-4o",
    messages=messages,
    tools=tools,
)

assistant_message = response.choices[0].message
```

Step 3: Handle tool calls and return results
When the model requests a tool call, you must execute it and send the result back:
```python
# Add the assistant's response to the conversation
messages.append(assistant_message)

# Process each tool call
for tool_call in assistant_message.tool_calls:
    tool_name = tool_call.function.name
    tool_args = json.loads(tool_call.function.arguments)

    # Execute the tool
    tool_result = TOOL_MAPPING[tool_name](**tool_args)

    # Add the tool result to the conversation
    messages.append({
        "role": "tool",
        "tool_call_id": tool_call.id,
        "name": tool_name,
        "content": json.dumps(tool_result),
    })

# Get the final response
final_response = client.chat.completions.create(
    model="gpt-4o",
    messages=messages,
    tools=tools,
)
print(final_response.choices[0].message.content)
```

The model will use the book search results to generate a natural-language response listing James Joyce's available works.
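Note that `tool_calls` is `None` when the model answers in plain text, and a real conversation may take several tool turns, so tool handling is usually wrapped in a driver loop. A minimal sketch, assuming `create_completion` is a hypothetical adapter that takes the message list and returns one assistant message as a plain dict (it is not part of the OpenAI SDK; in practice you would wrap `client.chat.completions.create` this way):

```python
import json

def run_tool_loop(create_completion, messages, tool_mapping, max_turns=5):
    """Drive the conversation until the model returns plain text.

    create_completion: hypothetical callable that sends `messages` to the
    model and returns one assistant message as a dict.
    """
    for _ in range(max_turns):
        message = create_completion(messages)
        messages.append(message)
        if not message.get("tool_calls"):  # model answered directly
            return message["content"]
        for call in message["tool_calls"]:
            name = call["function"]["name"]
            args = json.loads(call["function"]["arguments"])
            # Execute the tool and feed the result back, echoing the call id
            messages.append({
                "role": "tool",
                "tool_call_id": call["id"],
                "content": json.dumps(tool_mapping[name](**args)),
            })
    raise RuntimeError("model kept requesting tools after max_turns")
```

Capping the number of turns guards against a model that keeps requesting tools without converging on an answer.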
Tool choice
Control how the model uses tools with the `tool_choice` parameter:

| Value | Behavior |
|---|---|
| `"auto"` | Model decides whether to call a tool (default) |
| `"none"` | Model will not call any tools |
| `"required"` | Model must call at least one tool |
| `{"type": "function", "function": {"name": "..."}}` | Force a specific tool |
Supported models
Most modern models support tool calling. Check the model's capabilities on the pricing page or test with a simple request. Models that do not support tools will return an error if you include the tools parameter.
OhMyGPT automatically converts tool specifications between formats (e.g., OpenAI to Anthropic) so you can use the same tool definitions across different models.
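The structural difference between the formats is small: Anthropic tools are flat, with the JSON Schema under `input_schema` rather than nested in a `function` object under `parameters`. A sketch of such a conversion, purely as an illustration of the format mapping and not OhMyGPT's actual code:

```python
def openai_tool_to_anthropic(tool: dict) -> dict:
    # Anthropic tools put name/description at the top level and the
    # JSON Schema under "input_schema" instead of "parameters".
    fn = tool["function"]
    return {
        "name": fn["name"],
        "description": fn.get("description", ""),
        "input_schema": fn["parameters"],
    }

openai_tool = {
    "type": "function",
    "function": {
        "name": "search_gutenberg_books",
        "description": "Search for books in the Project Gutenberg library",
        "parameters": {"type": "object", "properties": {}},
    },
}
print(openai_tool_to_anthropic(openai_tool)["name"])
```

Because the gateway handles this translation, the OpenAI-format `tools` array from the example above works unchanged when you switch the `model` parameter to an Anthropic model.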