Uptime Optimization
Keep model access stable during provider incidents
Provider outages and transient rate limits are normal at scale. TheRouter.ai's uptime features help you route around incidents automatically.
Reference payload
Use this baseline request shape, adapting the model list, provider sort strategy, and token limits to your workload.
request.json
{
  "model": "anthropic/claude-sonnet-4.5",
  "models": [
    "anthropic/claude-sonnet-4.5",
    "google/gemini-2.5-pro",
    "openai/gpt-4o"
  ],
  "route": "fallback"
}

Configuration examples
TheRouter.ai keeps request semantics consistent across providers, so you can tune behavior without rewriting your app layer.
TypeScript
const payload = {
  model: "anthropic/claude-sonnet-4.5",
  models: ["anthropic/claude-sonnet-4.5", "openai/gpt-4o"],
  route: "fallback",
  provider: { allow_fallbacks: true, sort: "latency" },
  messages: [{ role: "user", content: "Draft incident update" }],
};

Production note
Operate with guardrails
Fallbacks improve reliability, but different models produce different output styles, so a fallback response may not match your primary model's tone or format. Add lightweight post-processing or evaluator checks to catch regressions.
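One such check can be a small gate that rejects empty, oversized, or refusal-style completions before they reach users. A minimal sketch; the `passesGuardrails` helper and its thresholds are illustrative, not part of TheRouter.ai's API:

```typescript
// Lightweight post-processing gate for completions returned via fallback.
// Thresholds and heuristics are illustrative; tune them to your workload.
interface GuardrailResult {
  ok: boolean;
  reason?: string;
}

function passesGuardrails(text: string, maxChars = 8000): GuardrailResult {
  const trimmed = text.trim();
  if (trimmed.length === 0) {
    return { ok: false, reason: "empty completion" };
  }
  if (trimmed.length > maxChars) {
    return { ok: false, reason: "completion exceeds length budget" };
  }
  // Cheap heuristic for refusal or error text leaking into user-facing copy.
  if (/^(error|i'm sorry, i can't)/i.test(trimmed)) {
    return { ok: false, reason: "looks like a refusal or error" };
  }
  return { ok: true };
}
```

Run the gate before caching or displaying a completion; on failure, retry with the next entry in `models` or surface a controlled error instead of raw model output.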
Use the activity feed and usage exports to validate that these settings improve reliability and cost in your real traffic mix.
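To exercise these settings end to end, the payload above can be posted to a chat completions endpoint. A minimal sketch, assuming an OpenAI-compatible `/api/v1/chat/completions` route, a bearer API key, and a response that echoes the serving model; the endpoint path and response fields are assumptions, not documented API:

```typescript
// Sketch: send a fallback-routed request and report which model served it.
// The base URL, endpoint path, and response shape are assumptions.
const BASE_URL = "https://therouter.ai/api/v1";

function buildPayload(prompt: string) {
  return {
    model: "anthropic/claude-sonnet-4.5",
    models: ["anthropic/claude-sonnet-4.5", "openai/gpt-4o"],
    route: "fallback",
    provider: { allow_fallbacks: true, sort: "latency" },
    messages: [{ role: "user", content: prompt }],
  };
}

async function sendWithFallback(prompt: string, apiKey: string): Promise<string> {
  const res = await fetch(`${BASE_URL}/chat/completions`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(buildPayload(prompt)),
  });
  if (!res.ok) {
    // Every model in the fallback list failed, or the request was rejected.
    throw new Error(`request failed: ${res.status}`);
  }
  const data = await res.json();
  // Logging the serving model makes silent fallbacks visible in your traces.
  console.log("served by:", data.model);
  return data.choices?.[0]?.message?.content ?? "";
}
```

Logging which model actually answered is what lets you correlate the activity feed and usage exports with real fallback behavior in your traffic.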