Cross-platform Voice Agents with Expo React Native

Build conversational AI agents that work across iOS and Android using Expo and the ElevenLabs React Native SDK with WebRTC support.

Introduction

In this tutorial you will learn how to build a voice agent that works across iOS and Android using Expo React Native and the ElevenLabs React Native SDK with WebRTC support.

Prefer to jump straight to the code?

Find the example project on GitHub.

Requirements

  • An ElevenLabs account with an API key.
  • Node.js v18 or higher installed on your machine.

Setup

Create a new Expo project

Using create-expo-app, create a new blank Expo project:

$npx create-expo-app@latest --template blank-typescript

Install dependencies

Install the ElevenLabs React Native SDK and its dependencies:

$npx expo install @elevenlabs/react-native @livekit/react-native @livekit/react-native-webrtc @config-plugins/react-native-webrtc @livekit/react-native-expo-plugin livekit-client

If you run into an issue with peer dependencies, add a .npmrc file in the root of the project with the following content:
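
.npmrc
legacy-peer-deps=true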

Enable microphone permissions and add Expo plugins

In the app.json file, add the following permissions:

app.json
{
  "expo": {
    "scheme": "elevenlabs",
    // ...
    "ios": {
      "infoPlist": {
        "NSMicrophoneUsageDescription": "This app uses the microphone to record audio."
      },
      "supportsTablet": true,
      "bundleIdentifier": "YOUR.BUNDLE.ID"
    },
    "android": {
      "permissions": [
        "android.permission.RECORD_AUDIO",
        "android.permission.ACCESS_NETWORK_STATE",
        "android.permission.CAMERA",
        "android.permission.INTERNET",
        "android.permission.MODIFY_AUDIO_SETTINGS",
        "android.permission.SYSTEM_ALERT_WINDOW",
        "android.permission.WAKE_LOCK",
        "android.permission.BLUETOOTH"
      ],
      "adaptiveIcon": {
        "foregroundImage": "./assets/adaptive-icon.png",
        "backgroundColor": "#ffffff"
      },
      "package": "YOUR.PACKAGE.ID"
    },
    "plugins": ["@livekit/react-native-expo-plugin", "@config-plugins/react-native-webrtc"]
    // ...
  }
}

This allows the app to prompt for microphone permissions when the conversation is started.

Note

When using an Android emulator, you will need to enable “Virtual microphone uses host audio input” in the emulator’s microphone settings.
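
The SDK prompts for the microphone permission automatically when a session starts, but if you prefer to request it yourself beforehand (for example, from an onboarding screen), a minimal sketch using the expo-av package could look like this. Note that expo-av is not among this tutorial's dependencies, so you would install it first with npx expo install expo-av:

./utils/permissions.ts
// Hypothetical helper: pre-request the microphone permission instead of
// relying on the automatic prompt when the conversation starts.
import { Audio } from 'expo-av';

export async function ensureMicPermission(): Promise<boolean> {
  const { granted } = await Audio.requestPermissionsAsync();
  return granted;
}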

Add ElevenLabs Conversational AI to your app

Add ElevenLabs Conversational AI to your app by placing the following code in your ./App.tsx file:

./App.tsx
import { ElevenLabsProvider, useConversation } from '@elevenlabs/react-native';
import type { ConversationStatus, ConversationEvent, Role } from '@elevenlabs/react-native';
import React, { useState } from 'react';
import {
  View,
  Text,
  StyleSheet,
  TouchableOpacity,
  Keyboard,
  TouchableWithoutFeedback,
  Platform,
  TextInput,
} from 'react-native';

import { getBatteryLevel, changeBrightness, flashScreen } from './utils/tools';

const ConversationScreen = () => {
  const conversation = useConversation({
    clientTools: {
      getBatteryLevel,
      changeBrightness,
      flashScreen,
    },
    onConnect: ({ conversationId }: { conversationId: string }) => {
      console.log('✅ Connected to conversation', conversationId);
    },
    onDisconnect: (details: string) => {
      console.log('❌ Disconnected from conversation', details);
    },
    onError: (message: string, context?: Record<string, unknown>) => {
      console.error('❌ Conversation error:', message, context);
    },
    onMessage: ({ message, source }: { message: ConversationEvent; source: Role }) => {
      console.log(`💬 Message from ${source}:`, message);
    },
    onModeChange: ({ mode }: { mode: 'speaking' | 'listening' }) => {
      console.log(`🔊 Mode: ${mode}`);
    },
    onStatusChange: ({ status }: { status: ConversationStatus }) => {
      console.log(`📡 Status: ${status}`);
    },
    onCanSendFeedbackChange: ({ canSendFeedback }: { canSendFeedback: boolean }) => {
      console.log(`🔊 Can send feedback: ${canSendFeedback}`);
    },
  });

  const [isStarting, setIsStarting] = useState(false);
  const [textInput, setTextInput] = useState('');

  const handleSubmitText = () => {
    if (textInput.trim()) {
      conversation.sendUserMessage(textInput.trim());
      setTextInput('');
      Keyboard.dismiss();
    }
  };

  const startConversation = async () => {
    if (isStarting) return;

    setIsStarting(true);
    try {
      await conversation.startSession({
        agentId: process.env.EXPO_PUBLIC_AGENT_ID,
        dynamicVariables: {
          platform: Platform.OS,
        },
      });
    } catch (error) {
      console.error('Failed to start conversation:', error);
    } finally {
      setIsStarting(false);
    }
  };

  const endConversation = async () => {
    try {
      await conversation.endSession();
    } catch (error) {
      console.error('Failed to end conversation:', error);
    }
  };

  const getStatusColor = (status: ConversationStatus): string => {
    switch (status) {
      case 'connected':
        return '#10B981';
      case 'connecting':
        return '#F59E0B';
      case 'disconnected':
        return '#EF4444';
      default:
        return '#6B7280';
    }
  };

  const getStatusText = (status: ConversationStatus): string => {
    return status[0].toUpperCase() + status.slice(1);
  };

  const canStart = conversation.status === 'disconnected' && !isStarting;
  const canEnd = conversation.status === 'connected';

  return (
    <TouchableWithoutFeedback onPress={() => Keyboard.dismiss()}>
      <View style={styles.container}>
        <Text style={styles.title}>ElevenLabs React Native Example</Text>
        <Text style={styles.subtitle}>Remember to set the agentId in the .env file!</Text>

        <View style={styles.statusContainer}>
          <View
            style={[styles.statusDot, { backgroundColor: getStatusColor(conversation.status) }]}
          />
          <Text style={styles.statusText}>{getStatusText(conversation.status)}</Text>
        </View>

        {/* Speaking Indicator */}
        {conversation.status === 'connected' && (
          <View style={styles.speakingContainer}>
            <View
              style={[
                styles.speakingDot,
                {
                  backgroundColor: conversation.isSpeaking ? '#8B5CF6' : '#D1D5DB',
                },
              ]}
            />
            <Text
              style={[
                styles.speakingText,
                { color: conversation.isSpeaking ? '#8B5CF6' : '#9CA3AF' },
              ]}
            >
              {conversation.isSpeaking ? '🎤 AI Speaking' : '👂 AI Listening'}
            </Text>
          </View>
        )}

        <View style={styles.buttonContainer}>
          <TouchableOpacity
            style={[styles.button, styles.startButton, !canStart && styles.disabledButton]}
            onPress={startConversation}
            disabled={!canStart}
          >
            <Text style={styles.buttonText}>
              {isStarting ? 'Starting...' : 'Start Conversation'}
            </Text>
          </TouchableOpacity>

          <TouchableOpacity
            style={[styles.button, styles.endButton, !canEnd && styles.disabledButton]}
            onPress={endConversation}
            disabled={!canEnd}
          >
            <Text style={styles.buttonText}>End Conversation</Text>
          </TouchableOpacity>
        </View>

        {/* Feedback Buttons */}
        {conversation.status === 'connected' && conversation.canSendFeedback && (
          <View style={styles.feedbackContainer}>
            <Text style={styles.feedbackLabel}>How was that response?</Text>
            <View style={styles.feedbackButtons}>
              <TouchableOpacity
                style={[styles.button, styles.likeButton]}
                onPress={() => conversation.sendFeedback(true)}
              >
                <Text style={styles.buttonText}>👍 Like</Text>
              </TouchableOpacity>
              <TouchableOpacity
                style={[styles.button, styles.dislikeButton]}
                onPress={() => conversation.sendFeedback(false)}
              >
                <Text style={styles.buttonText}>👎 Dislike</Text>
              </TouchableOpacity>
            </View>
          </View>
        )}

        {/* Text Input and Messaging */}
        {conversation.status === 'connected' && (
          <View style={styles.messagingContainer}>
            <Text style={styles.messagingLabel}>Send Text Message</Text>
            <TextInput
              style={styles.textInput}
              value={textInput}
              onChangeText={(text) => {
                setTextInput(text);
                // Prevent agent from interrupting while user is typing
                if (text.length > 0) {
                  conversation.sendUserActivity();
                }
              }}
              placeholder="Type your message or context... (Press Enter to send)"
              multiline
              onSubmitEditing={handleSubmitText}
              returnKeyType="send"
              blurOnSubmit={true}
            />
            <View style={styles.messageButtons}>
              <TouchableOpacity
                style={[styles.button, styles.messageButton]}
                onPress={handleSubmitText}
                disabled={!textInput.trim()}
              >
                <Text style={styles.buttonText}>💬 Send Message</Text>
              </TouchableOpacity>
              <TouchableOpacity
                style={[styles.button, styles.contextButton]}
                onPress={() => {
                  if (textInput.trim()) {
                    conversation.sendContextualUpdate(textInput.trim());
                    setTextInput('');
                    Keyboard.dismiss();
                  }
                }}
                disabled={!textInput.trim()}
              >
                <Text style={styles.buttonText}>📝 Send Context</Text>
              </TouchableOpacity>
            </View>
          </View>
        )}
      </View>
    </TouchableWithoutFeedback>
  );
};

export default function App() {
  return (
    <ElevenLabsProvider>
      <ConversationScreen />
    </ElevenLabsProvider>
  );
}

const styles = StyleSheet.create({
  container: {
    flex: 1,
    justifyContent: 'center',
    alignItems: 'center',
    backgroundColor: '#F3F4F6',
    padding: 20,
  },
  title: {
    fontSize: 24,
    fontWeight: 'bold',
    marginBottom: 8,
    color: '#1F2937',
  },
  subtitle: {
    fontSize: 16,
    color: '#6B7280',
    marginBottom: 32,
  },
  statusContainer: {
    flexDirection: 'row',
    alignItems: 'center',
    marginBottom: 24,
  },
  statusDot: {
    width: 12,
    height: 12,
    borderRadius: 6,
    marginRight: 8,
  },
  statusText: {
    fontSize: 16,
    fontWeight: '500',
    color: '#374151',
  },
  speakingContainer: {
    flexDirection: 'row',
    alignItems: 'center',
    marginBottom: 24,
  },
  speakingDot: {
    width: 12,
    height: 12,
    borderRadius: 6,
    marginRight: 8,
  },
  speakingText: {
    fontSize: 14,
    fontWeight: '500',
  },
  toolsContainer: {
    backgroundColor: '#E5E7EB',
    padding: 16,
    borderRadius: 8,
    marginBottom: 24,
    width: '100%',
  },
  toolsTitle: {
    fontSize: 14,
    fontWeight: '600',
    color: '#374151',
    marginBottom: 8,
  },
  toolItem: {
    fontSize: 12,
    color: '#6B7280',
    fontFamily: 'monospace',
    marginBottom: 4,
  },
  buttonContainer: {
    width: '100%',
    gap: 16,
  },
  button: {
    backgroundColor: '#3B82F6',
    paddingVertical: 16,
    paddingHorizontal: 32,
    borderRadius: 8,
    alignItems: 'center',
  },
  startButton: {
    backgroundColor: '#10B981',
  },
  endButton: {
    backgroundColor: '#EF4444',
  },
  disabledButton: {
    backgroundColor: '#9CA3AF',
  },
  buttonText: {
    color: 'white',
    fontSize: 16,
    fontWeight: '600',
  },
  instructions: {
    marginTop: 24,
    fontSize: 14,
    color: '#6B7280',
    textAlign: 'center',
    lineHeight: 20,
  },
  feedbackContainer: {
    marginTop: 24,
    alignItems: 'center',
  },
  feedbackLabel: {
    fontSize: 16,
    fontWeight: '500',
    color: '#374151',
    marginBottom: 12,
  },
  feedbackButtons: {
    flexDirection: 'row',
    gap: 16,
  },
  likeButton: {
    backgroundColor: '#10B981',
  },
  dislikeButton: {
    backgroundColor: '#EF4444',
  },
  messagingContainer: {
    marginTop: 24,
    width: '100%',
  },
  messagingLabel: {
    fontSize: 16,
    fontWeight: '500',
    color: '#374151',
    marginBottom: 8,
  },
  textInput: {
    backgroundColor: '#FFFFFF',
    borderRadius: 8,
    padding: 16,
    minHeight: 100,
    textAlignVertical: 'top',
    borderWidth: 1,
    borderColor: '#D1D5DB',
    marginBottom: 16,
  },
  messageButtons: {
    flexDirection: 'row',
    gap: 16,
  },
  messageButton: {
    backgroundColor: '#3B82F6',
    flex: 1,
  },
  contextButton: {
    backgroundColor: '#4F46E5',
    flex: 1,
  },
  activityContainer: {
    marginTop: 24,
    alignItems: 'center',
  },
  activityLabel: {
    fontSize: 14,
    color: '#6B7280',
    marginBottom: 8,
    textAlign: 'center',
  },
  activityButton: {
    backgroundColor: '#F59E0B',
  },
});
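
The agent ID is read from the EXPO_PUBLIC_AGENT_ID environment variable. Expo inlines variables prefixed with EXPO_PUBLIC_ at build time, so create a .env file in the root of the project, replacing the placeholder value below with your own agent ID (created in the agent configuration section further down):

.env
EXPO_PUBLIC_AGENT_ID=your-agent-id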

Native client tools

A big part of building conversational AI agents is allowing the agent to access and execute functionality on the device dynamically. This can be done via client tools.
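
The example tools below use Expo's expo-battery and expo-brightness modules. If they aren't already part of your project, install them first:

$npx expo install expo-battery expo-brightness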

Create a new file to hold your client tools: ./utils/tools.ts and add the following code:

./utils/tools.ts
import * as Battery from 'expo-battery';
import * as Brightness from 'expo-brightness';

// Returns the battery level as a decimal value between 0 and 1.
const getBatteryLevel = async () => {
  const batteryLevel = await Battery.getBatteryLevelAsync();
  console.log('batteryLevel', batteryLevel);
  if (batteryLevel === -1) {
    return 'Error: Device does not support retrieving the battery level.';
  }
  return batteryLevel;
};

// Sets the system brightness; expects a value between 0 and 1.
const changeBrightness = ({ brightness }: { brightness: number }) => {
  console.log('changeBrightness', brightness);
  Brightness.setSystemBrightnessAsync(brightness);
  return brightness;
};

// Flashes the screen by briefly setting the brightness to maximum.
const flashScreen = () => {
  Brightness.setSystemBrightnessAsync(1);
  setTimeout(() => {
    Brightness.setSystemBrightnessAsync(0);
  }, 200);
  return 'Successfully flashed the screen.';
};

export { getBatteryLevel, changeBrightness, flashScreen };

Dynamic variables

In addition to the client tools, we're also injecting the platform (e.g. ios, android, or web) as a dynamic variable into both the first message and the system prompt:

./App.tsx
// ...
const startConversation = async () => {
  if (isStarting) return;

  setIsStarting(true);
  try {
    await conversation.startSession({
      agentId: process.env.EXPO_PUBLIC_AGENT_ID,
      dynamicVariables: {
        platform: Platform.OS,
      },
    });
  } catch (error) {
    console.error('Failed to start conversation:', error);
  } finally {
    setIsStarting(false);
  }
};
// ...

Agent configuration

1. Sign in to ElevenLabs

   Go to elevenlabs.io and sign in to your account.

2. Create a new agent

   Navigate to Conversational AI > Agents and create a new agent from the blank template.

3. Set the first message

   Set the first message, specifying the dynamic variable for the platform:

   Hi there, woah, so cool that I'm running on {{platform}}. What can I help you with?

4. Set the system prompt

   Set the system prompt. You can also include dynamic variables here:

   You are a helpful assistant running on {{platform}}. You have access to certain tools that allow you to check the user device battery level and change the display brightness. Use these tools if the user asks about them. Otherwise, just answer the question.

5. Set up the client tools

   Set up the following client tools:

  • Name: getBatteryLevel
    • Description: Gets the device battery level as a decimal value between 0 and 1.
    • Wait for response: true
    • Response timeout (seconds): 3
  • Name: changeBrightness
    • Description: Changes the brightness of the device screen.
    • Wait for response: true
    • Response timeout (seconds): 3
    • Parameters:
      • Data Type: number
      • Identifier: brightness
      • Required: true
      • Value Type: LLM Prompt
      • Description: A number between 0 and 1, inclusive, representing the desired screen brightness.
  • Name: flashScreen
    • Description: Quickly flashes the screen on and off.
    • Wait for response: true
    • Response timeout (seconds): 3

Run the app

This app requires native dependencies that aren't supported in Expo Go, so you will need to prebuild the app and run it on a native device.

  • Terminal 1:
    • Run npx expo prebuild --clean to regenerate the native project directories.
$npx expo prebuild --clean
    • Run npx expo start --tunnel to start the Expo development server over https.
$npx expo start --tunnel
  • Terminal 2:
    • Run npx expo run:ios --device to build and run the app on your iOS device (for Android, see the command below).
$npx expo run:ios --device
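
To run on an Android device instead, use the equivalent run command:

$npx expo run:android --device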