Overview

One API, Every LLM Provider. LLM AI Router gives you one endpoint for all your AI providers.

What is LLM AI Router?

LLM AI Router sits between your coding tools (Claude Code, Cursor, Cline, Codex CLI, etc.) and AI providers. Add your API keys, create stacks with fallback tiers, point your tools at llmairouter.com — done.

Instead of configuring each tool with each provider separately, you configure everything once in the router dashboard and use a single endpoint everywhere.
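As a sketch of what "configure once" looks like in practice: many CLI tools that speak the OpenAI API honor a base-URL environment variable. The exact variable names depend on each tool, and the key value below is an illustrative placeholder — check your tool's documentation.

```shell
# Point an OpenAI-compatible tool at the router instead of a provider.
# OPENAI_BASE_URL is honored by many tools; variable names vary per tool.
export OPENAI_BASE_URL="https://llmairouter.com/api/v1"
# Use your Router API key in place of a provider key (placeholder value).
export OPENAI_API_KEY="your-router-api-key"
```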

How it works

1. Sign up at llmairouter.com and log into your dashboard.
2. Add your AI provider API keys (encrypted with AES-256-GCM).
3. Create stacks with multi-tier routing (primary → fallback → emergency).
4. Generate a Router API key from the API Keys tab.
5. Point your CLI coding tools at https://llmairouter.com/api/v1.
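Once the steps above are done, a tool's requests flow through the router endpoint. Assuming the endpoint is OpenAI-compatible and routes by stack name (both assumptions; "my-stack" is an illustrative name), a request body might look like this sketch, shown here building and inspecting the payload rather than sending it:

```python
import json

# Hypothetical request a coding tool would send through the router.
base_url = "https://llmairouter.com/api/v1"
payload = {
    "model": "my-stack",  # route by stack name (assumption)
    "messages": [
        {"role": "user", "content": "Explain this function."},
    ],
}
print(json.dumps(payload, indent=2))
```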

Key features

Stacks: Named routing configurations with multiple tiers. If tier 1 fails, requests automatically fall back to tier 2, then tier 3.
Circuit Breaker: Automatic provider health monitoring. Failed providers are bypassed until they recover.
Response Cache: Identical requests return cached results instantly, saving cost and latency.
Analytics: Track usage per provider, cost breakdown, latency, and request logs.
Encrypted Keys: All API keys are encrypted with AES-256-GCM and never stored in plaintext.
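The stack and circuit-breaker behavior can be approximated in a few lines. This is a minimal illustrative sketch, not the router's actual implementation; the tier list, failure counting, and threshold are all assumptions:

```python
class CircuitBreaker:
    """Bypass a provider after repeated failures (illustrative threshold)."""

    def __init__(self, threshold=3):
        self.threshold = threshold
        self.failures = {}

    def available(self, provider):
        return self.failures.get(provider, 0) < self.threshold

    def record_failure(self, provider):
        self.failures[provider] = self.failures.get(provider, 0) + 1


def route(stack_tiers, call, breaker):
    """Try each tier in order; fall through to the next on failure."""
    for provider in stack_tiers:
        if not breaker.available(provider):
            continue  # provider tripped the breaker; skip it until recovery
        try:
            return call(provider)
        except Exception:
            breaker.record_failure(provider)
    raise RuntimeError("all tiers failed")


# Example: tier 1 raises, so the request falls back to tier 2.
def fake_call(provider):
    if provider == "openai":
        raise ConnectionError("provider down")
    return f"response from {provider}"

result = route(["openai", "anthropic", "groq"], fake_call, CircuitBreaker())
print(result)  # prints "response from anthropic"
```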