Kotlin SDK
Conversational AI SDK: deploy customized, interactive voice agents to your Android app in minutes.
Refer to the Conversational AI overview for an explanation of how Conversational AI works.
Installation
Add the ElevenLabs SDK to your Android project by including the following dependency in your app-level build.gradle file:
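A sketch of what that declaration might look like; the artifact coordinates and version placeholder below are assumptions, so confirm the published group, artifact name, and latest version in the SDK repository:

```gradle
dependencies {
    // Coordinates and version are assumptions -- check the SDK repository's README for the real ones.
    implementation("io.elevenlabs:elevenlabs-android:<latest-version>")
}
```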
An example Android app using this SDK can be found in the ElevenLabs Android SDK repository.
Requirements
- Android API level 21 (Android 5.0) or higher
- Internet permission for API calls
- Microphone permission for voice input
- Network security configuration for HTTPS calls
Setup
Manifest Configuration
Add the necessary permissions to your AndroidManifest.xml:
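The requirements listed above correspond to these manifest entries:

```xml
<!-- Required for API calls -->
<uses-permission android:name="android.permission.INTERNET" />
<!-- Required for capturing voice input -->
<uses-permission android:name="android.permission.RECORD_AUDIO" />
```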
Runtime Permissions
For Android 6.0 (API level 23) and higher, you must request microphone permission at runtime:
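A minimal sketch using the AndroidX Activity Result API; any standard runtime-permission flow works just as well:

```kotlin
import android.Manifest
import android.content.pm.PackageManager
import androidx.activity.ComponentActivity
import androidx.activity.result.contract.ActivityResultContracts
import androidx.core.content.ContextCompat

class ConversationActivity : ComponentActivity() {

    // Launcher for the RECORD_AUDIO runtime permission prompt.
    private val requestMicPermission =
        registerForActivityResult(ActivityResultContracts.RequestPermission()) { granted ->
            if (granted) {
                // Safe to start the conversation session.
            } else {
                // Explain why the microphone is needed and offer a retry.
            }
        }

    private fun ensureMicPermission() {
        val alreadyGranted = ContextCompat.checkSelfPermission(
            this, Manifest.permission.RECORD_AUDIO
        ) == PackageManager.PERMISSION_GRANTED
        if (!alreadyGranted) {
            requestMicPermission.launch(Manifest.permission.RECORD_AUDIO)
        }
    }
}
```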
Network Security Configuration
For apps targeting Android 9 (API level 28) or higher, ensure your network security configuration allows cleartext traffic if needed:
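A minimal sketch of res/xml/network_security_config.xml, referenced from the manifest's application element via android:networkSecurityConfig="@xml/network_security_config". The ElevenLabs API is served over HTTPS, so cleartext only matters for non-HTTPS hosts of your own (the domain below is a placeholder):

```xml
<?xml version="1.0" encoding="utf-8"?>
<network-security-config>
    <!-- Keep cleartext disabled globally; HTTPS traffic to the ElevenLabs API is unaffected. -->
    <base-config cleartextTrafficPermitted="false" />
    <!-- Only if needed: allow cleartext for a specific host of your own. -->
    <domain-config cleartextTrafficPermitted="true">
        <domain includeSubdomains="true">internal.example.com</domain>
    </domain-config>
</network-security-config>
```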
Usage
Initialize the ElevenLabs SDK in your Application class or main activity, then start a conversation session with either:
- Public agent: pass agentId
- Private agent: pass a conversationToken provisioned from your backend (never expose your API key to the client).
Note that Conversational AI requires microphone access. Consider explaining and requesting permissions in your app’s UI before the conversation starts, especially on Android 6.0+ where runtime permissions are required.
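A minimal sketch of that flow; ConversationClient is a placeholder name for the SDK's entry point, and the exact shape of ConversationConfig and its callbacks may differ, so treat this as an outline rather than the definitive API:

```kotlin
import android.util.Log
import androidx.activity.ComponentActivity

// Sketch only -- entry-point and parameter names are assumptions based on this page.
fun startConversation(activity: ComponentActivity) {
    val config = ConversationConfig(
        agentId = "your-agent-id",      // public agent
        // conversationToken = token,   // private agent: token provisioned by your backend
        onConnect = { Log.d("ConvAI", "connected") },
        onMessage = { message -> Log.d("ConvAI", "message: $message") }
    )

    // Establishes the WebRTC connection and starts streaming microphone audio.
    val session = ConversationClient.startSession(activity, config)
}
```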
Callbacks
The ConversationConfig can be configured with callbacks to handle conversation events:
- onConnect - Called when the WebRTC connection is established.
- onMessage - Called when a new message is received. These can be tentative or final transcriptions of the user's voice, replies produced by the LLM, or debug messages.
- onModeChange - Called when the conversation mode changes. This is useful for indicating whether the agent is speaking or listening.
- onStatusChange - Called when the conversation status changes.
- onCanSendFeedbackChange - Called when the ability to send feedback changes.
- onUnhandledClientToolCall - Called when the agent requests a client tool that is not registered on the device.
- onVadScore - Called when the voice activity detection score changes.
Not all client events are enabled by default for an agent. If you have enabled a callback but aren’t seeing events come through, ensure that your Conversational AI agent has the corresponding event enabled. You can do this in the “Advanced” tab of the agent settings in the ElevenLabs dashboard.
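For illustration, wiring a few of these callbacks into the config could look like the sketch below; the callback parameter types are assumptions:

```kotlin
// Sketch only -- exact callback signatures may differ from the assumptions shown here.
val config = ConversationConfig(
    agentId = "your-agent-id",
    onConnect = { /* connection established: enable the in-call UI */ },
    onMessage = { message -> /* tentative/final transcription, LLM reply, or debug message */ },
    onModeChange = { mode -> /* show "Agent is speaking" vs. "Listening" */ },
    onStatusChange = { status -> /* connecting / connected / disconnected */ },
    onVadScore = { score -> /* drive a microphone-level indicator */ }
)
```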
Methods
startSession
The startSession method initiates the WebRTC connection and starts using the microphone to communicate with the ElevenLabs Conversational AI agent.
Public agents
For public agents (i.e. agents that don’t have authentication enabled), only the agentId is required. The Agent ID can be acquired through the ElevenLabs UI.
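For example (a sketch; the entry-point name and config shape are assumptions):

```kotlin
// Public agent: only the agent ID is needed.
val session = ConversationClient.startSession(
    activity,
    ConversationConfig(agentId = "your-agent-id")
)
```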
Private agents
For private agents, you must pass in a conversationToken obtained from the ElevenLabs API. Generating this token requires an ElevenLabs API key, so request it from your own backend rather than from the client. The conversationToken is valid for 10 minutes. Then, pass the token to the startSession method. Note that only the conversationToken is required for private agents.
You can optionally pass a user ID to identify the user in the conversation. This can be your own customer identifier. This will be included in the conversation initiation data sent to the server.
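A sketch of the private-agent flow; fetchConversationToken() is a hypothetical call to your own backend (which holds the ElevenLabs API key and mints the short-lived token), and the ConversationSession return type and parameter names are assumptions based on this page:

```kotlin
// Sketch only -- your backend exchanges the API key for a short-lived conversationToken.
suspend fun startPrivateConversation(activity: ComponentActivity): ConversationSession {
    val token = fetchConversationToken()   // hypothetical call to YOUR backend, not ElevenLabs directly
    return ConversationClient.startSession(
        activity,
        ConversationConfig(
            conversationToken = token,     // required for private agents
            userId = "customer-123"        // optional: your own user identifier
        )
    )
}
```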
endSession
Manually ends the conversation, disconnecting the session.
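For example (a no-argument call is assumed):

```kotlin
// Disconnects and ends the active conversation.
session.endSession()
```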
sendUserMessage
Send a text message to the agent during an active conversation. This will trigger a response from the agent.
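For example (a single text argument is assumed):

```kotlin
// The agent will respond to this message.
session.sendUserMessage("What time do you open tomorrow?")
```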
sendContextualUpdate
Sends contextual information to the agent that won’t trigger a response.
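For example (a single text argument is assumed):

```kotlin
// Shared silently as context; the agent will not respond to it directly.
session.sendContextualUpdate("User navigated to the billing screen.")
```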
sendFeedback
Provide feedback on the conversation quality. This helps improve the agent’s performance.
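For example, assuming a boolean flag for positive or negative feedback:

```kotlin
// true = positive feedback, false = negative (parameter shape assumed).
session.sendFeedback(true)
```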
sendUserActivity
Notifies the agent about user activity to prevent interruptions. Useful when the user is actively using the app and the agent should pause speaking, e.g. while the user is typing in a chat.
The agent will pause speaking for ~2 seconds after receiving this signal.
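For example, forwarding keystrokes from a chat input (a no-argument call is assumed):

```kotlin
// Call on each user interaction, e.g. every keystroke in a chat input,
// so the agent holds off speaking for ~2 seconds.
fun onUserTyping() {
    session.sendUserActivity()
}
```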
Mute / Unmute
Observe session.isMuted to update the UI label between “Mute” and “Unmute”.
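A sketch assuming isMuted is readable as a Boolean; the toggle call shown is a hypothetical name, since this page only documents the property:

```kotlin
// toggleMute() is hypothetical -- check the SDK reference for the actual mute control.
muteButton.setOnClickListener {
    session.toggleMute()
    muteButton.text = if (session.isMuted) "Unmute" else "Mute"
}
```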
Properties
status
Get the current status of the conversation.
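For example (the exact status type and values are assumptions):

```kotlin
// Log the current conversation status, e.g. connecting / connected / disconnected.
Log.d("ConvAI", "Conversation status: ${session.status}")
```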
Example Implementation
For an example implementation, see the example app in the ElevenLabs Android SDK repository.