System tools

Update the internal state of conversations without external requests.

System tools enable your assistant to update the internal state of a conversation. Unlike server tools or client tools, system tools don't make external API calls or trigger client-side functions; they operate entirely on the conversation's internal state.

Overview

Some applications require agents to control the flow or state of a conversation. System tools provide this capability by allowing the assistant to perform actions related to the state of the call that don’t require communicating with external servers or the client.

Available system tools

  • End call — terminate the conversation
  • Language detection — switch to the user's detected language
  • Agent transfer — hand off the conversation to another AI agent
  • Transfer to human — hand off the conversation to a human operator
  • Skip turn — pause and wait for user input

Implementation

When creating an agent via the API, you can add system tools to your agent configuration. The Python example later in this section shows how to add both the end call and language detection tools.

Custom LLM integration

When using a custom LLM with ElevenLabs agents, system tools are exposed as function definitions that your LLM can call. Each system tool has specific parameters and trigger conditions:
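As an illustration of the flow above, a custom LLM backend can parse the function call emitted by the model and dispatch on the tool name. This is a minimal sketch: the handler bodies are stand-ins, and everything beyond the documented function-call format is an assumption.

```python
import json

# Map each system tool name to a handler (handlers here are stand-ins).
HANDLERS = {
    "end_call": lambda args: f"ending call: {args['reason']}",
    "language_detection": lambda args: f"switching to {args['language']}",
}

def dispatch_system_tool(tool_call: dict) -> str:
    """Route a function call emitted by the LLM to the matching handler.

    `tool_call` follows the documented format:
    {"type": "function", "function": {"name": ..., "arguments": "<JSON string>"}}
    """
    fn = tool_call["function"]
    args = json.loads(fn["arguments"])  # arguments arrive as a JSON-encoded string
    handler = HANDLERS.get(fn["name"])
    if handler is None:
        raise ValueError(f"unknown system tool: {fn['name']}")
    return handler(args)

call = {
    "type": "function",
    "function": {
        "name": "end_call",
        "arguments": json.dumps({"reason": "Task completed successfully"}),
    },
}
print(dispatch_system_tool(call))  # ending call: Task completed successfully
```

Note that `arguments` is a JSON string, not a nested object, so it must be decoded before use.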


End call

Purpose: Automatically terminate conversations when appropriate conditions are met.

Trigger conditions: The LLM should call this tool when:

  • The main task has been completed and the user is satisfied
  • The conversation has reached a natural conclusion with mutual agreement
  • The user explicitly indicates they want to end the conversation

Parameters:

  • reason (string, required): The reason for ending the call
  • message (string, optional): A farewell message to send to the user before ending the call

Function call format:

{
  "type": "function",
  "function": {
    "name": "end_call",
    "arguments": "{\"reason\": \"Task completed successfully\", \"message\": \"Thank you for using our service. Have a great day!\"}"
  }
}

Implementation: Configure as a system tool in your agent settings. The LLM will receive detailed instructions about when to call this function.

Learn more: End call tool

Language detection

Purpose: Automatically switch to the user’s detected language during conversations.

Trigger conditions: The LLM should call this tool when:

  • User speaks in a different language than the current conversation language
  • User explicitly requests to switch languages
  • Multi-language support is needed for the conversation

Parameters:

  • reason (string, required): The reason for the language switch
  • language (string, required): The language code to switch to (must be in supported languages list)

Function call format:

{
  "type": "function",
  "function": {
    "name": "language_detection",
    "arguments": "{\"reason\": \"User requested Spanish\", \"language\": \"es\"}"
  }
}

Implementation: Configure supported languages in agent settings and add the language detection system tool. The agent will automatically switch voice and responses to match detected languages.
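Since the `language` parameter must be in the supported languages list, a custom LLM integration can validate the code before emitting the call. A minimal sketch; the language set below is illustrative, not your agent's actual configuration:

```python
import json

# Illustrative subset; use the languages actually configured on your agent.
SUPPORTED_LANGUAGES = {"en", "es", "fr", "de"}

def language_switch_call(reason: str, language: str) -> dict:
    """Build a language_detection function call, rejecting unsupported codes."""
    if language not in SUPPORTED_LANGUAGES:
        raise ValueError(f"{language!r} is not in the supported languages list")
    return {
        "type": "function",
        "function": {
            "name": "language_detection",
            "arguments": json.dumps({"reason": reason, "language": language}),
        },
    }

call = language_switch_call("User requested Spanish", "es")
print(call["function"]["name"])  # language_detection
```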

Learn more: Language detection tool

Agent transfer

Purpose: Transfer conversations between specialized AI agents based on user needs.

Trigger conditions: The LLM should call this tool when:

  • User request requires specialized knowledge or different agent capabilities
  • Current agent cannot adequately handle the query
  • Conversation flow indicates need for different agent type

Parameters:

  • reason (string, optional): The reason for the agent transfer
  • agent_number (integer, required): Zero-indexed number of the agent to transfer to (based on configured transfer rules)

Function call format:

{
  "type": "function",
  "function": {
    "name": "transfer_to_agent",
    "arguments": "{\"reason\": \"User needs billing support\", \"agent_number\": 0}"
  }
}

Implementation: Define transfer rules mapping conditions to specific agent IDs. Configure which agents the current agent can transfer to. Agents are referenced by zero-indexed numbers in the transfer configuration.
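The zero-indexed numbering can be mirrored in a routing table so out-of-range transfers are caught early. This is purely illustrative; the agent IDs are placeholders, not real identifiers:

```python
import json

# Zero-indexed transfer targets mirroring the configured transfer rules.
# Agent IDs here are placeholders.
TRANSFER_AGENTS = ["agent_billing_placeholder", "agent_tech_placeholder"]

def transfer_call(reason: str, agent_number: int) -> dict:
    """Build a transfer_to_agent call, checking the index against the config."""
    if not 0 <= agent_number < len(TRANSFER_AGENTS):
        raise ValueError(f"agent_number {agent_number} has no configured target")
    return {
        "type": "function",
        "function": {
            "name": "transfer_to_agent",
            "arguments": json.dumps({"reason": reason, "agent_number": agent_number}),
        },
    }

call = transfer_call("User needs billing support", 0)
print(json.loads(call["function"]["arguments"])["agent_number"])  # 0
```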

Learn more: Agent transfer tool

Transfer to human

Purpose: Seamlessly hand off conversations to human operators when AI assistance is insufficient.

Trigger conditions: The LLM should call this tool when:

  • Complex issues require human judgment
  • The user explicitly requests human assistance
  • The AI reaches the limits of its capability for the specific request
  • Escalation protocols are triggered

Parameters:

  • reason (string, optional): The reason for the transfer
  • transfer_number (string, required): The phone number to transfer to (must match configured numbers)
  • client_message (string, required): Message read to the client while waiting for transfer
  • agent_message (string, required): Message for the human operator receiving the call

Function call format:

{
  "type": "function",
  "function": {
    "name": "transfer_to_number",
    "arguments": "{\"reason\": \"Complex billing issue\", \"transfer_number\": \"+15551234567\", \"client_message\": \"I'm transferring you to a billing specialist who can help with your account.\", \"agent_message\": \"Customer has a complex billing dispute about order #12345 from last month.\"}"
  }
}

Implementation: Configure transfer phone numbers and conditions. Define messages for both customer and receiving human operator. Works with both Twilio and SIP trunking.
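Because this tool has three required parameters and the number must match the configuration, validating the payload before emitting it avoids failed transfers. A hedged sketch; the configured number is a placeholder:

```python
import json

# Numbers the agent is allowed to transfer to (placeholder value).
CONFIGURED_NUMBERS = {"+15551234567"}
REQUIRED_FIELDS = ("transfer_number", "client_message", "agent_message")

def transfer_to_number_call(**kwargs) -> dict:
    """Build a transfer_to_number call, enforcing required fields and
    checking the target against the configured numbers."""
    missing = [f for f in REQUIRED_FIELDS if not kwargs.get(f)]
    if missing:
        raise ValueError(f"missing required fields: {missing}")
    if kwargs["transfer_number"] not in CONFIGURED_NUMBERS:
        raise ValueError("transfer_number must match a configured number")
    return {
        "type": "function",
        "function": {"name": "transfer_to_number", "arguments": json.dumps(kwargs)},
    }

call = transfer_to_number_call(
    reason="Complex billing issue",  # optional field
    transfer_number="+15551234567",
    client_message="I'm transferring you to a billing specialist.",
    agent_message="Customer has a complex billing dispute.",
)
print(call["function"]["name"])  # transfer_to_number
```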

Learn more: Transfer to human tool

Skip turn

Purpose: Allow the agent to pause and wait for user input without speaking.

Trigger conditions: The LLM should call this tool when:

  • User indicates they need a moment (“Give me a second”, “Let me think”)
  • User requests pause in conversation flow
  • Agent detects user needs time to process information

Parameters:

  • reason (string, optional): Free-form reason explaining why the pause is needed

Function call format:

{
  "type": "function",
  "function": {
    "name": "skip_turn",
    "arguments": "{\"reason\": \"User requested time to think\"}"
  }
}

Implementation: No additional configuration needed. The tool simply signals the agent to remain silent until the user speaks again.

Learn more: Skip turn tool

Example: adding the end call and language detection system tools when creating an agent via the Python SDK.

from elevenlabs import (
    ConversationalConfig,
    ElevenLabs,
    AgentConfig,
    PromptAgent,
    PromptAgentInputToolsItem_System,
)

# Initialize the client
elevenlabs = ElevenLabs(api_key="YOUR_API_KEY")

# Create system tools
end_call_tool = PromptAgentInputToolsItem_System(
    name="end_call",
    description=""  # Optional: customize when the tool should be triggered
)

language_detection_tool = PromptAgentInputToolsItem_System(
    name="language_detection",
    description=""  # Optional: customize when the tool should be triggered
)

# Create the agent configuration with both tools
conversation_config = ConversationalConfig(
    agent=AgentConfig(
        prompt=PromptAgent(
            tools=[end_call_tool, language_detection_tool]
        )
    )
)

# Create the agent
response = elevenlabs.conversational_ai.agents.create(
    conversation_config=conversation_config
)

FAQ

Can system tools be used together with server tools and client tools?

Yes. System tools can be used alongside server tools and client tools in the same assistant, combining internal state management with external interactions.