A protocol-faithful, multi-provider AI gateway written in Rust.
Route requests across OpenAI, Anthropic, Google Gemini, and the entire OpenAI-compatible ecosystem — with native wire types, streaming SSE, and zero lowest-common-denominator abstractions.
Most multi-provider LLM wrappers force every provider into a single "universal" message format, silently dropping fields and features along the way. AI Gateway takes the opposite approach:
- Protocol-faithful — Each provider crate models the upstream API exactly as documented. No fields are silently dropped, no semantics are lost.
- Forward-compatible — All wire types carry `#[serde(flatten)] extra` to survive upstream API changes without recompilation.
- Translation at the edge — Cross-provider mapping happens at the gateway layer via `TryFrom`/`Into`, not inside individual providers.
- Streaming-first — Full SSE event fidelity for every provider.
```mermaid
flowchart TB
    Client([Your Application])
    Client -->|Unified API| Gateway

    subgraph Gateway["⚡ AI Gateway"]
        direction TB
        Router[Request Router]
        Translate[Protocol Translation]
        Stream[SSE Streaming Engine]
        Router --> Translate --> Stream
    end

    subgraph Providers["Provider Crates"]
        direction LR
        OpenAI["aigw-openai"]
        Anthropic["aigw-anthropic"]
        Compat["aigw-openai-compat"]
        Gemini["aigw-gemini"]
    end

    Gateway --> OpenAI & Anthropic & Compat & Gemini

    OpenAI -->|Responses API\nChat Completions| OpenAI_API["OpenAI API"]
    Anthropic -->|Messages API| Anthropic_API["Anthropic API"]
    Compat -->|Chat Completions| Compat_APIs["Groq · Together\nvLLM · Fireworks\nPerplexity · LM Studio"]
    Gemini -->|generateContent| Gemini_API["Google Gemini API"]
```
| Provider | Crate | Chat | Streaming | Status |
|---|---|---|---|---|
| OpenAI | `aigw-openai` | ✅ | ✅ | 🚧 Active |
| Anthropic | `aigw-anthropic` | ✅ | ✅ | 🚧 Active |
| Google Gemini | `aigw-gemini` | – | – | 🏗️ Skeleton |
**OpenAI-Compatible Providers via `aigw-openai-compat`**

Configure any OpenAI-compatible provider with a `base_url` plus `Quirks` capability flags — no new crate needed.
Adding a new provider? If the API is OpenAI-compatible, add a `Quirks` config to `aigw-openai-compat`. Only create a new crate for providers with a distinct wire format.
Each provider crate defines its own native request/response types mirroring the upstream API. Translation between providers is handled via `TryFrom`/`Into` at the gateway layer.
Third-party OpenAI-compatible providers declare their capabilities through a `Quirks` struct. Unsupported fields are stripped before sending, not silently ignored.
```rust
Quirks {
    supports_responses_api: false,
    supports_tool_choice: true,
    supports_parallel_tool_calls: false,
    supports_vision: true,
    supports_streaming: true,
}
```

API keys are stored as `secrecy::SecretString` — their `Debug` output is redacted, so they never appear in logs.
```shell
cargo build --workspace    # Build
cargo test --workspace     # Test
cargo clippy --workspace   # Lint
```

See CONTRIBUTING.md for the full development guide, code conventions, and how to add new providers.