# TanStack AI

Frontend AI streaming with TheRouter.ai adapters
TanStack AI works well for reactive frontends. TheRouter.ai adds provider flexibility and model routing without changing your component-level chat patterns.
## Overview
This page mirrors the OpenRouter workflow and adapts it for TheRouter.ai. Use TheRouter.ai as your OpenAI-compatible endpoint and keep model IDs in `provider/model` format.
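Model IDs in the `provider/model` format can be validated before they reach the adapter, which catches typos early. A minimal sketch, assuming a hypothetical `parseModelId` helper (not part of either SDK):

```typescript
// Hypothetical helper: splits a "provider/model" ID such as
// "openai/gpt-4o-mini" into its two parts, throwing on malformed input.
export function parseModelId(id: string): { provider: string; model: string } {
  const slash = id.indexOf("/");
  if (slash <= 0 || slash === id.length - 1) {
    throw new Error(`Expected "provider/model", got "${id}"`);
  }
  // Everything after the first slash is the model name, so IDs like
  // "provider/org/model" keep their full model segment intact.
  return { provider: id.slice(0, slash), model: id.slice(slash + 1) };
}
```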
## Installation
Install the required SDKs and keep your TheRouter.ai key in environment variables.
install.sh

```shell
npm install @tanstack/ai @tanstack/ai-openrouter
```

## Configuration
Set TheRouter.ai base URL overrides and pass your API key. Add attribution headers when your app should appear in rankings.
TypeScript

```ts
// Import paths assumed from the packages installed above.
import { chat } from "@tanstack/ai";
import { openRouterText } from "@tanstack/ai-openrouter";

const stream = chat({
  adapter: openRouterText("openai/gpt-4o-mini"),
  messages: [{ role: "user", content: "Hello" }],
});
```

## Caveats
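Since this page adapts the OpenRouter workflow, the attribution convention sketched below uses OpenRouter's `HTTP-Referer` and `X-Title` header names; whether TheRouter.ai reads these exact headers is an assumption to verify against its docs. A small helper keeps the header logic in one place:

```typescript
// Builds auth + attribution headers, mirroring the OpenRouter convention
// this page adapts. The attribution header names are an assumption for
// TheRouter.ai; confirm them before relying on rankings.
export function attributionHeaders(opts: {
  apiKey: string;
  siteUrl?: string; // where your app lives, shown in rankings if supported
  siteName?: string; // human-readable app name
}): Record<string, string> {
  const headers: Record<string, string> = {
    Authorization: `Bearer ${opts.apiKey}`,
    "Content-Type": "application/json",
  };
  if (opts.siteUrl) headers["HTTP-Referer"] = opts.siteUrl;
  if (opts.siteName) headers["X-Title"] = opts.siteName;
  return headers;
}
```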
**Integration note**
Browser apps should never expose long-lived production keys. Proxy requests through your backend or edge API route.
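A backend proxy can be sketched as a pure request builder plus a thin fetch-style handler, so the API key stays server-side. The base URL below is hypothetical, and `buildUpstreamRequest`/`makeChatProxy` are illustrative names, not SDK APIs:

```typescript
// Hypothetical base URL; check TheRouter.ai docs for the real endpoint.
const THEROUTER_BASE_URL = "https://api.therouter.ai/v1";

// Pure builder so the forwarding logic is easy to test. The key is passed
// in server-side and never shipped to the browser.
export function buildUpstreamRequest(
  body: string,
  apiKey: string,
): { url: string; init: RequestInit } {
  return {
    url: `${THEROUTER_BASE_URL}/chat/completions`,
    init: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${apiKey}`, // server-held key
      },
      body,
    },
  };
}

// Example route handler with the fetch-style signature most edge runtimes use.
export function makeChatProxy(apiKey: string) {
  return async (request: Request): Promise<Response> => {
    const { url, init } = buildUpstreamRequest(await request.text(), apiKey);
    return fetch(url, init); // streams the upstream body back to the client
  };
}
```

Mounting the returned handler on a route like `/api/chat` lets the browser call your own origin while the key lives only in server environment configuration.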
For production, pin SDK versions and verify model compatibility in staging before rolling out broadly.