February 1, 2026
Voice-to-Video LiveKit Plugin
Overview
This recipe takes an existing LiveKit voice agent and gives it a TruGen video avatar using a plugin. LiveKit still handles the real-time audio and signaling; TruGen renders the avatar and synchronizes it with the agent’s speech.
You’ll learn how to:
- Connect a LiveKit voice agent to a TruGen avatar stream.
- Configure both sides so media flows correctly.
- Run the combined voice+video experience in the browser.
What you’ll build
A small LiveKit + TruGen integration where:
- LiveKit manages the audio session (input/output).
- TruGen renders a synced video avatar.
- Users speak to the agent and see a face respond in real time.
Prerequisites
- A LiveKit project and voice agent.
- A TruGen account with access to video avatars or plugins.
- Basic TypeScript/JavaScript and WebRTC familiarity.
The complete example code lives at voice-to-video/livekit in the trugen-examples repository.
Project setup
1. Clone the example
git clone https://github.com/trugenai/trugen-examples.git
cd trugen-examples/voice-to-video/livekit
Install dependencies:
npm install
2. Configure LiveKit and TruGen
Update the project configuration with:
- LiveKit API keys and server URL.
- TruGen API key and avatar/plugin configuration.
You’ll typically provide:
- A LiveKit token for connecting to a room.
- A TruGen agent or avatar ID that should be paired with the voice agent.
You’ll usually configure this via environment variables like:
LIVEKIT_API_KEY=your_livekit_api_key
LIVEKIT_API_SECRET=your_livekit_api_secret
LIVEKIT_URL=wss://your-livekit-url
TRUGEN_AVATAR_ID=your_trugen_avatar_id
TRUGEN_API_KEY=your_trugen_api_key
Check the example README for the exact variable names.
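To fail fast on a misconfigured environment, you can validate these variables at startup. A minimal sketch in TypeScript, assuming the variable names shown above (the example's README is authoritative; the loadConfig helper is not part of the example code):

```typescript
// Reads the configuration variables from the environment and throws a
// descriptive error if any are missing, instead of failing later mid-call.
type Env = Record<string, string | undefined>;

interface AppConfig {
  livekitApiKey: string;
  livekitApiSecret: string;
  livekitUrl: string;
  trugenApiKey: string;
  trugenAvatarId: string;
}

function loadConfig(env: Env = process.env): AppConfig {
  const required = [
    "LIVEKIT_API_KEY",
    "LIVEKIT_API_SECRET",
    "LIVEKIT_URL",
    "TRUGEN_API_KEY",
    "TRUGEN_AVATAR_ID",
  ];
  const missing = required.filter((name) => !env[name]);
  if (missing.length > 0) {
    throw new Error(`Missing environment variables: ${missing.join(", ")}`);
  }
  return {
    livekitApiKey: env.LIVEKIT_API_KEY!,
    livekitApiSecret: env.LIVEKIT_API_SECRET!,
    livekitUrl: env.LIVEKIT_URL!,
    trugenApiKey: env.TRUGEN_API_KEY!,
    trugenAvatarId: env.TRUGEN_AVATAR_ID!,
  };
}
```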
Understanding the flow
At a high level:
- The client obtains a LiveKit access token from your backend.
- The client connects to a LiveKit room using the JavaScript SDK.
- The TruGen plugin is initialized with:
  - The LiveKit room connection.
  - The selected avatar configuration.
- When the agent generates speech audio, TruGen renders the avatar video in sync with it.
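The first step, minting the access token on your backend, is normally done with livekit-server-sdk's AccessToken class. To make the flow concrete, here is a sketch of what that token actually is: an HS256-signed JWT whose claims carry the room grant. The mintLiveKitToken function name is mine, not the SDK's; prefer the SDK in real code.

```typescript
import { createHmac } from "node:crypto";

// Base64url-encode without padding, as required by the JWT spec.
function b64url(input: string | Buffer): string {
  return Buffer.from(input)
    .toString("base64")
    .replace(/\+/g, "-")
    .replace(/\//g, "_")
    .replace(/=+$/, "");
}

function mintLiveKitToken(
  apiKey: string,
  apiSecret: string,
  identity: string,
  room: string,
  ttlSeconds = 3600,
): string {
  const header = { alg: "HS256", typ: "JWT" };
  const now = Math.floor(Date.now() / 1000);
  const payload = {
    iss: apiKey,                      // LiveKit API key
    sub: identity,                    // participant identity
    nbf: now,
    exp: now + ttlSeconds,
    video: { room, roomJoin: true },  // grant: may join this room
  };
  const signingInput =
    `${b64url(JSON.stringify(header))}.${b64url(JSON.stringify(payload))}`;
  const signature = b64url(
    createHmac("sha256", apiSecret).update(signingInput).digest(),
  );
  return `${signingInput}.${signature}`;
}
```

The client receives this token from your backend and never sees the API secret, which stays server-side.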
The browser ends up rendering:
- A LiveKit UI region (mute button, connection state, etc.).
- A TruGen avatar element (usually a <video> or <canvas> inside a container).
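The wiring order above can be sketched as a small async function. The RoomLike and AvatarPluginLike interfaces below are hypothetical stand-ins for the LiveKit room object and the TruGen plugin; the real plugin surface is defined by the example code, so treat this as a shape, not an API reference:

```typescript
// Minimal stand-in for the LiveKit room connection.
interface RoomLike {
  connect(url: string, token: string): Promise<void>;
}

// Hypothetical avatar-plugin surface: attach to a connected room and
// render into a container element identified by id.
interface AvatarPluginLike {
  attach(
    room: RoomLike,
    opts: { avatarId: string; containerId: string },
  ): Promise<void>;
}

async function startVoiceToVideo(
  room: RoomLike,
  plugin: AvatarPluginLike,
  opts: { url: string; token: string; avatarId: string; containerId: string },
): Promise<void> {
  // 1) Join the LiveKit room first, so audio is already flowing...
  await room.connect(opts.url, opts.token);
  // 2) ...then hand the live connection to the avatar plugin, which
  //    renders the synced <video>/<canvas> into the given container.
  await plugin.attach(room, {
    avatarId: opts.avatarId,
    containerId: opts.containerId,
  });
}
```

The key point is ordering: the plugin needs an already-connected room so it can subscribe to the agent's audio and drive the avatar from it.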
Running the example
3. Start the dev server
Run the example:
npm run dev
Open the app in your browser and connect. You should see:
- A LiveKit connection UI.
- A TruGen avatar that animates and speaks based on the agent’s responses.
Customizing behavior and appearance
From here you can:
- Swap avatars or personas in TruGen.
- Adjust camera layout or UI elements in the example.
- Tune audio and network settings in LiveKit to match your use case.
Common customizations include:
- Resizing or repositioning the avatar video.
- Changing background or framing.
- Showing additional UI for room participants or chat text.
Next steps
- Add authentication so only logged-in users can create LiveKit rooms.
- Log events from both LiveKit and TruGen for analytics.
- Extend the UI with controls like mute, end call, and avatar switching.
- Integrate this flow into a broader app (for example, an “agent with face” support center or sales assistant).