Langfuse
Prompt and generation observability on TheRouter.ai traffic
Langfuse's OpenAI wrapper can ingest TheRouter.ai calls with minimal code changes, so you can preserve trace continuity while migrating.
Overview
This page mirrors the OpenRouter workflow and adapts it for TheRouter.ai. Use TheRouter.ai as your OpenAI-compatible endpoint and keep model IDs in `provider/model` format.
Installation
Install the required SDKs and keep your TheRouter.ai key in environment variables.
install.sh
pip install langfuse openai
Configuration
Override the client's base URL to point at TheRouter.ai and pass your API key. Add attribution headers if you want your app to appear in rankings.
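A minimal Python sketch using Langfuse's drop-in OpenAI client. The base URL and `THEROUTER_API_KEY` environment variable follow this page's conventions; the model ID is an illustrative example in `provider/model` format:

```python
import os

# Langfuse's traced drop-in replacement for the openai SDK client.
# Calls made through it are captured as generations once the
# LANGFUSE_* environment variables are configured.
from langfuse.openai import OpenAI

client = OpenAI(
    base_url="https://api.therouter.ai/v1",  # TheRouter.ai OpenAI-compatible endpoint
    api_key=os.environ["THEROUTER_API_KEY"],
)

# Model IDs stay in provider/model format (example model, substitute your own).
completion = client.chat.completions.create(
    model="openai/gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello"}],
)
print(completion.choices[0].message.content)
```

Because the wrapper preserves the standard OpenAI client interface, no other call sites need to change.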
TypeScript
import OpenAI from "openai";

const openai = new OpenAI({ baseURL: "https://api.therouter.ai/v1", apiKey: process.env.THEROUTER_API_KEY });
Caveats
Integration note
Be deliberate about prompt capture in traces. For sensitive workloads, scrub user identifiers and secrets before sending to observability backends.
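One hedged approach to scrubbing is a regex pass over prompt and completion text before it reaches the tracing client. The patterns below are illustrative only, not a complete PII or secret-detection solution:

```python
import re

# Illustrative patterns; production workloads should use a vetted
# PII/secret scanner rather than hand-rolled regexes.
PATTERNS = [
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "<email>"),           # email addresses
    (re.compile(r"\b(?:sk|key)-[A-Za-z0-9]{16,}\b"), "<secret>"),  # API-key-like tokens
]


def scrub(text: str) -> str:
    """Mask identifiers and secrets before text is sent to an observability backend."""
    for pattern, replacement in PATTERNS:
        text = pattern.sub(replacement, text)
    return text
```

Run `scrub` on payloads before handing them to the tracing layer, so raw identifiers never leave your infrastructure.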
For production, pin SDK versions and validate model compatibility in staging before a broad rollout.