LiveKit

Real-time voice + TheRouter.ai model access

LiveKit Agents can use TheRouter.ai as their LLM backend, letting you switch models and providers without changing your agent orchestration code.

Overview

This page mirrors the OpenRouter workflow and adapts it for TheRouter.ai. Point your OpenAI-compatible client at the TheRouter.ai endpoint and keep model IDs in the `provider/model` format.
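Since everything downstream assumes the `provider/model` ID format, it can help to validate IDs before handing them to the client. The helper below is a hypothetical sketch, not part of the LiveKit or TheRouter.ai SDKs:

```typescript
// Hypothetical helper: splits a `provider/model` ID and rejects malformed
// values early, before they reach the LLM client.
function parseModelId(id: string): { provider: string; model: string } {
  const slash = id.indexOf("/");
  if (slash <= 0 || slash === id.length - 1) {
    throw new Error(`expected "provider/model", got "${id}"`);
  }
  // Keep everything after the first slash as the model name, since some
  // model IDs contain additional slashes.
  return { provider: id.slice(0, slash), model: id.slice(slash + 1) };
}
```

For example, `parseModelId("openai/gpt-4o-mini")` yields `{ provider: "openai", model: "gpt-4o-mini" }`, while a bare `"gpt-4o-mini"` throws.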

Installation

Install the required SDKs and keep your TheRouter.ai key in environment variables.

install.sh
uv add "livekit-agents[openai]~=1.2"

Configuration

Override the client's base URL to point at TheRouter.ai and pass your API key. Add attribution headers if you want your app to appear in rankings.

TypeScript
import OpenAI from "openai";

const openaiClient = new OpenAI({
  baseURL: "https://api.therouter.ai/v1",
  apiKey: process.env.THEROUTER_API_KEY,
});
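The attribution headers mentioned above can be bundled into the client options. The sketch below is an assumption: the `HTTP-Referer` and `X-Title` header names follow the OpenRouter convention, so confirm the exact names TheRouter.ai recognizes in its documentation. The options-builder function itself is hypothetical:

```typescript
// Sketch of an options builder for the OpenAI SDK client. The attribution
// header names below are borrowed from the OpenRouter convention and are
// an assumption; verify them against TheRouter.ai's docs.
interface AttributionOptions {
  apiKey: string;
  appUrl?: string;   // hypothetical: your app's public URL
  appTitle?: string; // hypothetical: display name for rankings
}

function buildClientOptions(opts: AttributionOptions) {
  const defaultHeaders: Record<string, string> = {};
  if (opts.appUrl) defaultHeaders["HTTP-Referer"] = opts.appUrl; // assumed header name
  if (opts.appTitle) defaultHeaders["X-Title"] = opts.appTitle;  // assumed header name
  return {
    baseURL: "https://api.therouter.ai/v1",
    apiKey: opts.apiKey,
    defaultHeaders,
  };
}
```

You would then construct the client with `new OpenAI(buildClientOptions({ apiKey, appTitle: "My Voice Agent" }))`, keeping the attribution concern out of the call sites.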

Caveats

Integration note
Voice pipelines are latency-sensitive; prefer models with a low time to first token (TTFT) and keep fallback chains short for interactive sessions.

For production, pin SDK versions and validate model compatibility in staging before rolling out broadly.