Mastra
Mastra agent runtime with TheRouter.ai models
Mastra works well with TheRouter.ai through provider adapters, letting you keep agent workflows while expanding model and provider choices.
Overview
This page mirrors the OpenRouter workflow and adapts it for TheRouter.ai. Use TheRouter.ai as your OpenAI-compatible endpoint and keep model IDs in `provider/model` format.
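Because every model reference keeps the `provider/model` shape, it can help to fail fast on malformed IDs before they reach the router. A minimal sketch of such a check (the helper name is illustrative, not part of either SDK):

```typescript
// Hypothetical helper: split a `provider/model` ID and reject malformed input.
function parseModelId(id: string): { provider: string; model: string } {
  const slash = id.indexOf("/");
  if (slash <= 0 || slash === id.length - 1) {
    throw new Error(`Expected "provider/model", got "${id}"`);
  }
  // Everything after the first slash is the model name, which may itself
  // contain dots or further slashes depending on the provider.
  return { provider: id.slice(0, slash), model: id.slice(slash + 1) };
}
```

For example, `parseModelId("anthropic/claude-sonnet-4.5")` yields the provider `anthropic` and the model `claude-sonnet-4.5`.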
Installation
Install the required SDKs and keep your TheRouter.ai key in environment variables.
install.sh
npm install @openrouter/ai-sdk-provider @mastra/core
Configuration
Override the base URL to point at TheRouter.ai and pass your API key. Add attribution headers if you want your app to appear in rankings.
TypeScript
import { createOpenRouter } from "@openrouter/ai-sdk-provider";

// The base URL is read from an environment variable here; take the exact
// OpenAI-compatible endpoint URL from your TheRouter.ai dashboard.
const openrouter = createOpenRouter({
  apiKey: process.env.THEROUTER_API_KEY,
  baseURL: process.env.THEROUTER_BASE_URL,
});
const model = openrouter("anthropic/claude-sonnet-4.5");
Caveats
Integration note
When multiple agents run concurrently, set per-agent spend limits and choose model tiers explicitly to avoid runaway costs.
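Neither SDK enforces budgets for you, so the spend limit has to live in application code. A minimal sketch of a per-agent tracker, assuming your app can attribute a dollar cost to each completed call (all names here are illustrative, not Mastra or OpenRouter APIs):

```typescript
// Illustrative per-agent budget guard; cost accounting is app-defined.
class SpendTracker {
  private spent = new Map<string, number>();

  constructor(private limitUsd: number) {}

  // Record a completed call's cost; throw once the agent would exceed its limit.
  record(agentId: string, costUsd: number): void {
    const total = (this.spent.get(agentId) ?? 0) + costUsd;
    if (total > this.limitUsd) {
      throw new Error(`Agent ${agentId} exceeded $${this.limitUsd} limit`);
    }
    this.spent.set(agentId, total);
  }

  // Remaining budget, useful for choosing a cheaper model tier near the cap.
  remaining(agentId: string): number {
    return this.limitUsd - (this.spent.get(agentId) ?? 0);
  }
}
```

Checking `remaining()` before dispatching a request lets concurrent agents downgrade to a cheaper model tier, rather than fail, as a budget runs low.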
For production, pin SDK versions and validate model compatibility in staging before rolling out broadly.