Connect your assistant to external data & systems.

Tools enable your assistant to connect to external data and systems. You can define a set of tools that the assistant has access to, and the assistant will use them where appropriate based on the conversation.

Overview

Many applications require assistants to call external APIs for real-time information. Tools give your assistant the ability to make function calls to third-party apps and services so it can respond with up-to-date information.

Here are a few examples where tools can be useful:

  • Fetching data: enable an assistant to retrieve real-time data from any REST-enabled database or 3rd party integration before responding to the user.
  • Taking action: allow an assistant to trigger authenticated actions based on the conversation, like scheduling meetings or initiating order returns.

To interact with application UIs or trigger client-side events, use client tools instead.

Tool configuration

Conversational AI assistants can be equipped with tools to interact with external APIs. Unlike hard-coded API requests, the assistant generates query, body, and path parameters dynamically based on the conversation and the parameter descriptions you provide.

All tool configurations and parameter descriptions help the assistant determine when and how to use these tools. To orchestrate tool usage effectively, update the assistant’s system prompt to specify the sequence and logic for making these calls. This includes:

  • Which tool to use and under what conditions.
  • What parameters the tool needs to function properly.
  • How to handle the responses.

Define a high-level Name and Description to describe the tool’s purpose. This helps the LLM understand the tool and know when to call it.

If the API requires path parameters, include variables in the URL path by wrapping them in curly braces {}, for example: /api/resource/{id} where id is a path parameter.
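
For illustration, a webhook tool definition might look roughly like the sketch below. This is only a sketch: the endpoint, the `get_order_status` name, and the `order_id` parameter are hypothetical, and the field names approximate the configuration form rather than an exact schema.

```python
# Illustrative webhook tool definition. The endpoint, tool name, and order_id
# parameter are hypothetical; field names approximate the configuration form,
# not an exact schema.
order_status_tool = {
    "name": "get_order_status",
    "description": "Look up the current status of a customer's order.",
    "method": "GET",
    # {order_id} is a path parameter the assistant fills in from the conversation.
    "url": "https://api.example.com/orders/{order_id}",
    "path_parameters": {
        "order_id": {
            "type": "string",
            "description": "The order identifier mentioned by the user, e.g. 'ORD-1042'.",
        }
    },
}
```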


Guide

In this guide, we’ll create a weather assistant that can provide real-time weather information for any location. The assistant will use its geographic knowledge to convert location names into coordinates and fetch accurate weather data.

1. Configure the weather tool

First, on the Agent section of your agent settings page, choose Add Tool. Select Webhook as the Tool Type, then configure the weather API integration as outlined below.

An API key is not required for this tool. If your API requires one, pass it in the request headers and store the value as a secret.
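
As a rough sketch of the values you would enter, assuming the free Open-Meteo forecast endpoint (any weather API that accepts coordinates works just as well, and the field names below approximate the tool editor rather than an exact schema):

```python
# Sketch of the get_weather tool configuration. The Open-Meteo endpoint is an
# assumption for illustration; field names approximate the tool editor.
get_weather_tool = {
    "name": "get_weather",
    "description": "Fetch current weather conditions for a given latitude and longitude.",
    "method": "GET",
    "url": "https://api.open-meteo.com/v1/forecast",
    "query_parameters": {
        "latitude": {
            "type": "number",
            "description": "Latitude of the location, e.g. 35.6762 for Tokyo.",
        },
        "longitude": {
            "type": "number",
            "description": "Longitude of the location, e.g. 139.6503 for Tokyo.",
        },
        "current_weather": {
            "type": "boolean",
            "description": "Set to true to return current conditions.",
        },
    },
    # No API key is needed here; if your API requires one, send it as a header
    # and store the value as a secret rather than hard-coding it.
    "headers": {},
}
```

The parameter descriptions matter: they are what the assistant reads when deciding which values to fill in from the conversation.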

2. Orchestration

Configure your assistant to handle weather queries intelligently with this system prompt:

System prompt
You are a helpful conversational AI assistant with access to a weather tool. When users ask about
weather conditions, use the get_weather tool to fetch accurate, real-time data. The tool requires
a latitude and longitude - use your geographic knowledge to convert location names to coordinates
accurately.
Never ask users for coordinates - you must determine these yourself. Always report weather
information conversationally, referring to locations by name only. For weather requests:
1. Extract the location from the user's message
2. Convert the location to coordinates and call get_weather
3. Present the information naturally and helpfully
For non-weather queries, provide friendly assistance within your knowledge boundaries. Always be
concise, accurate, and helpful.
First message: "Hey, how can I help you today?"

Test your assistant by asking about the weather in different locations. The assistant should handle specific locations (“What’s the weather in Tokyo?”) and ask for clarification on ambiguous queries (“How’s the weather looking today?”).
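
It can also help to sanity-check the endpoint outside the assistant. The snippet below reproduces, under the same Open-Meteo assumption as above, the request the assistant effectively makes for “What’s the weather in Tokyo?” (in a live conversation the LLM supplies the coordinates itself):

```python
import requests

# Reproduce the call the assistant effectively makes for "What's the weather in
# Tokyo?". In a live conversation the LLM supplies the coordinates itself.
params = {"latitude": 35.6762, "longitude": 139.6503, "current_weather": "true"}
response = requests.get("https://api.open-meteo.com/v1/forecast", params=params)
response.raise_for_status()
print(response.json().get("current_weather"))
```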

Best practices

Name tools intuitively, with detailed descriptions

If you find the assistant does not make calls to the correct tools, you may need to update your tool names and descriptions so the assistant more clearly understands when it should select each tool. Avoid using abbreviations or acronyms to shorten tool and argument names.

You can also include detailed descriptions of when a tool should be called. For complex tools, include a description for each argument so the assistant knows what it needs to ask the user in order to collect it.

Name tool parameters intuitively, with detailed descriptions

Use clear and descriptive names for tool parameters. If applicable, specify the expected format for a parameter in the description (e.g., YYYY-mm-dd or dd/mm/yy for a date).
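
For example, a hypothetical `schedule_meeting` tool might describe its date and time arguments as follows (parameter names and formats are illustrative):

```python
# Hypothetical argument descriptions for a schedule_meeting tool, pinning down
# the expected formats so the assistant collects them correctly.
schedule_meeting_parameters = {
    "date": {
        "type": "string",
        "description": "Meeting date in YYYY-mm-dd format, e.g. 2025-03-14.",
    },
    "start_time": {
        "type": "string",
        "description": "Start time in 24-hour HH:MM format, e.g. 14:30.",
    },
}
```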

Consider providing additional information about how and when to call tools in your assistant’s system prompt

Providing clear instructions in your system prompt can significantly improve the assistant’s tool calling accuracy. For example, guide the assistant with instructions like the following:

Use `check_order_status` when the user inquires about the status of their order, such as 'Where is my order?' or 'Has my order shipped yet?'.

Provide context for complex scenarios. For example:

Before scheduling a meeting with `schedule_meeting`, check the user's calendar for availability using `check_availability` to avoid conflicts.

LLM selection

When using tools, we recommend picking higher-intelligence models such as GPT-4o mini or Claude 3.5 Sonnet, and avoiding Gemini 1.5 Flash.

The choice of LLM matters to the success of function calls: some LLMs struggle to extract the relevant parameters from the conversation.
