Arize

Observability-first integration for AI workloads

Arize auto-instrumentation works with TheRouter.ai because TheRouter.ai exposes an OpenAI-compatible API. You keep your existing tracing pipelines and change only the endpoint configuration.

Overview

This page mirrors the OpenRouter workflow and adapts it for TheRouter.ai. Use TheRouter.ai as your OpenAI-compatible endpoint and keep model IDs in `provider/model` format.
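The `provider/model` convention can be checked before requests are sent. A minimal sketch; the `is_valid_model_id` helper is illustrative and not part of any SDK:

```python
def is_valid_model_id(model_id: str) -> bool:
    """Check that a model ID looks like 'provider/model' with non-empty parts."""
    parts = model_id.split("/", 1)
    return len(parts) == 2 and all(p.strip() for p in parts)
```

For example, `"openai/gpt-4o"` passes, while a bare model name without a provider prefix does not.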

Installation

Install the required SDKs and keep your TheRouter.ai key in environment variables.

install.sh
pip install arize-otel openai openinference-instrumentation-openai

Configuration

Override the OpenAI base URL to point at TheRouter.ai and pass your API key. Add attribution headers if you want your app to appear in rankings.

TypeScript
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://api.therouter.ai/v1",
  apiKey: process.env.THEROUTER_API_KEY,
});
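The same configuration can be wired up in Python together with Arize tracing. A configuration sketch assuming the Arize quickstart API (`arize.otel.register`) and OpenInference's OpenAI instrumentor; the space ID, project name, and attribution header names (`HTTP-Referer`, `X-Title`, mirroring OpenRouter's convention) are placeholders and assumptions, not confirmed by TheRouter.ai:

```python
import os

from arize.otel import register
from openinference.instrumentation.openai import OpenAIInstrumentor
from openai import OpenAI

# Register an OTel tracer provider that exports spans to Arize.
# Space ID, Arize API key, and project name are placeholders.
tracer_provider = register(
    space_id=os.environ["ARIZE_SPACE_ID"],
    api_key=os.environ["ARIZE_API_KEY"],
    project_name="therouter-demo",  # hypothetical project name
)

# Auto-instrument the OpenAI SDK so calls made through it are traced.
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)

# Point the OpenAI client at TheRouter.ai's OpenAI-compatible endpoint.
client = OpenAI(
    base_url="https://api.therouter.ai/v1",
    api_key=os.environ["THEROUTER_API_KEY"],
    # Attribution headers; names assumed to mirror OpenRouter's convention.
    default_headers={
        "HTTP-Referer": "https://example.com",  # your app URL
        "X-Title": "My App",                    # your app name
    },
)
```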

Caveats

Integration note
Tracing can capture request metadata and prompts depending on your instrumentation settings; review redaction policy before enabling in regulated environments.
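If prompt payloads must not leave your environment, OpenInference supports masking span content via `TraceConfig` (or the corresponding `OPENINFERENCE_HIDE_INPUTS` / `OPENINFERENCE_HIDE_OUTPUTS` environment variables). A sketch, assuming these settings are supported by your instrumentor version:

```python
from openinference.instrumentation import TraceConfig
from openinference.instrumentation.openai import OpenAIInstrumentor

# Mask prompt inputs and completion outputs before spans are exported.
config = TraceConfig(hide_inputs=True, hide_outputs=True)
OpenAIInstrumentor().instrument(config=config)
```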

For production, pin SDK versions and validate model compatibility in staging before rolling out broadly.