Principles
Core values behind TheRouter.ai's multi-model platform
TheRouter.ai helps teams run production AI workloads across multiple models and providers with one integration. Our architecture and product choices follow a small set of principles that prioritize quality, speed, and operational resilience.
1. Price and performance optimization
TheRouter.ai continuously evaluates provider options so you can optimize for cost, latency, throughput, or a combination of all three. You can keep a preferred provider order while allowing automatic fallbacks.
{
  "provider": {
    "order": ["aws-bedrock", "openai", "anthropic"],
    "allow_fallbacks": true,
    "sort": "latency"
  }
}
2. Standardized API
Model switching should not require application rewrites. TheRouter.ai exposes an OpenAI-compatible interface so teams can move between providers and model families without changing request structure.
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://api.therouter.ai/v1",
  apiKey: process.env.THEROUTER_API_KEY,
});

const response = await client.chat.completions.create({
  model: "anthropic/claude-sonnet-4.5",
  messages: [{ role: "user", content: "Summarize this release note." }],
});
3. Consolidated billing
Teams should not need separate metering pipelines for each provider. TheRouter.ai unifies usage accounting and billing, while still exposing detailed model and provider-level usage data.
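To make the idea concrete, the sketch below rolls hypothetical per-request usage records up into per-provider cost totals, the kind of consolidated view a unified billing layer enables. The `UsageRecord` shape and field names here are illustrative assumptions, not a documented TheRouter.ai schema.

```typescript
// Hypothetical shape of a per-request usage record; the field names are
// illustrative, not TheRouter.ai's actual schema.
interface UsageRecord {
  model: string; // e.g. "anthropic/claude-sonnet-4.5"
  provider: string; // e.g. "anthropic"
  totalCostCents: number; // integer cents to avoid float rounding
}

// Aggregate per-request records into per-provider totals, so one report
// covers every provider instead of one metering pipeline each.
function costByProvider(records: UsageRecord[]): Record<string, number> {
  const totals: Record<string, number> = {};
  for (const r of records) {
    totals[r.provider] = (totals[r.provider] ?? 0) + r.totalCostCents;
  }
  return totals;
}

const sample: UsageRecord[] = [
  { model: "openai/gpt-4o", provider: "openai", totalCostCents: 12 },
  { model: "anthropic/claude-sonnet-4.5", provider: "anthropic", totalCostCents: 20 },
  { model: "openai/gpt-4o", provider: "openai", totalCostCents: 8 },
];
console.log(costByProvider(sample)); // { openai: 20, anthropic: 20 }
```

The same grouping works at any granularity (per model, per team, per API key) by swapping the key the loop groups on.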
4. High availability by design
Model APIs can degrade or fail unexpectedly. TheRouter.ai is built around fallback chains, provider diversity, and smart retry behavior so your traffic keeps flowing when one path is unavailable.
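As an illustration of the fallback pattern, the helper below walks an ordered list of candidate models and moves to the next one when a call fails. This is a hypothetical sketch of the general technique, not TheRouter.ai's internals; the `tryModel` callback stands in for a real completion request.

```typescript
// Minimal fallback-chain sketch: try each candidate model in order and
// return the first successful result. tryModel is a placeholder for an
// actual completion call; this is not TheRouter.ai's internal routing code.
async function completeWithFallback<T>(
  candidates: string[],
  tryModel: (model: string) => Promise<T>,
): Promise<T> {
  let lastError: unknown;
  for (const model of candidates) {
    try {
      return await tryModel(model);
    } catch (err) {
      // Provider degraded or unavailable; fall through to the next candidate.
      lastError = err;
    }
  }
  throw lastError;
}
```

In practice a router layers retries with backoff and health checks on top of this loop, but the core guarantee is the same: traffic keeps flowing as long as any candidate in the chain is healthy.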
Learn more
Continue with Models to understand TheRouter.ai's model naming, metadata, and discovery.