
⚡ AI Gateway

A protocol-faithful, multi-provider AI gateway written in Rust.

Route requests across OpenAI, Anthropic, Google Gemini, and the entire OpenAI-compatible ecosystem — with native wire types, streaming SSE, and zero lowest-common-denominator abstractions.



Why AI Gateway?

Most multi-provider LLM wrappers force every provider into a single "universal" message format, silently dropping fields and features along the way. AI Gateway takes the opposite approach:

  • Protocol-faithful — Each provider crate models the upstream API exactly as documented. No fields are silently dropped, no semantics are lost.
  • Forward-compatible — All wire types carry #[serde(flatten)] extra to survive upstream API changes without recompilation.
  • Translation at the edge — Cross-provider mapping happens at the gateway layer via TryFrom/Into, not inside individual providers.
  • Streaming-first — Full SSE event fidelity for every provider.

Architecture

```mermaid
flowchart TB
    Client([Your Application])

    Client -->|Unified API| Gateway

    subgraph Gateway["⚡ AI Gateway"]
        direction TB
        Router[Request Router]
        Translate[Protocol Translation]
        Stream[SSE Streaming Engine]
        Router --> Translate --> Stream
    end

    subgraph Providers["Provider Crates"]
        direction LR
        OpenAI["aigw-openai"]
        Anthropic["aigw-anthropic"]
        Compat["aigw-openai-compat"]
        Gemini["aigw-gemini"]
    end

    Gateway --> OpenAI & Anthropic & Compat & Gemini

    OpenAI -->|Responses API\nChat Completions| OpenAI_API["OpenAI API"]
    Anthropic -->|Messages API| Anthropic_API["Anthropic API"]
    Compat -->|Chat Completions| Compat_APIs["Groq · Together\nvLLM · Fireworks\nPerplexity · LM Studio"]
    Gemini -->|generateContent| Gemini_API["Google Gemini API"]
```

Supported Providers

| Provider | Crate | Chat | Streaming | Status |
|----------|-------|------|-----------|--------|
| OpenAI | `aigw-openai` | | | 🚧 Active |
| Anthropic | `aigw-anthropic` | | | 🚧 Active |
| Google Gemini | `aigw-gemini` | | | 🏗️ Skeleton |

OpenAI-Compatible Providers via aigw-openai-compat

Configure any OpenAI-compatible provider with a base_url + Quirks capability flags — no new crate needed.

Adding a new provider? If the API is OpenAI-compatible, add a Quirks config to aigw-openai-compat. Only create a new crate for providers with a distinct wire format.

Design Principles

No universal message type

Each provider crate defines its own native request/response types mirroring the upstream API. Translation between providers is handled via TryFrom/Into at the gateway layer.
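As a sketch of the pattern (the two request types below are illustrative stand-ins, not the real crate types), cross-provider translation is an explicit, fallible conversion:

```rust
// Illustrative stand-ins for two providers' native request types; the real
// crates model the full upstream APIs.
struct OpenAiRequest {
    model: String,
    max_tokens: Option<u32>,
}

struct AnthropicRequest {
    model: String,
    max_tokens: u32, // the Messages API requires max_tokens
}

// Translation is explicit and fallible: a field the target provider requires
// but the source omitted becomes an error, never a silent default.
impl TryFrom<OpenAiRequest> for AnthropicRequest {
    type Error = String;

    fn try_from(req: OpenAiRequest) -> Result<Self, Self::Error> {
        let max_tokens = req
            .max_tokens
            .ok_or_else(|| "max_tokens is required by the Messages API".to_string())?;
        Ok(AnthropicRequest { model: req.model, max_tokens })
    }
}

fn main() {
    let src = OpenAiRequest { model: "gpt-x".into(), max_tokens: Some(256) };
    let dst = AnthropicRequest::try_from(src).unwrap();
    assert_eq!(dst.max_tokens, 256);

    // A request that cannot be faithfully translated fails loudly.
    let missing = OpenAiRequest { model: "gpt-x".into(), max_tokens: None };
    assert!(AnthropicRequest::try_from(missing).is_err());
    println!("translation ok");
}
```

Keeping the conversion at the gateway layer means each provider crate stays a faithful mirror of one API, and incompatibilities surface as typed errors instead of dropped fields.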

Quirks-based compat

Third-party OpenAI-compatible providers declare their capabilities through a Quirks struct. Unsupported fields are stripped before sending, not silently ignored.

```rust
Quirks {
    supports_responses_api: false,
    supports_tool_choice: true,
    supports_parallel_tool_calls: false,
    supports_vision: true,
    supports_streaming: true,
}
```
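A minimal, self-contained sketch of the stripping step (the request type and `apply_quirks` function are assumptions for illustration, not the crate's actual API):

```rust
#[derive(Clone, Copy)]
struct Quirks {
    supports_tool_choice: bool,
    supports_parallel_tool_calls: bool,
}

// Illustrative request with two optional OpenAI-style fields.
#[derive(Debug, PartialEq)]
struct ChatRequest {
    model: String,
    tool_choice: Option<String>,
    parallel_tool_calls: Option<bool>,
}

// Drop fields the target provider does not support before the request goes
// out, instead of forwarding them and hoping the provider ignores them.
fn apply_quirks(mut req: ChatRequest, quirks: Quirks) -> ChatRequest {
    if !quirks.supports_tool_choice {
        req.tool_choice = None;
    }
    if !quirks.supports_parallel_tool_calls {
        req.parallel_tool_calls = None;
    }
    req
}

fn main() {
    let quirks = Quirks { supports_tool_choice: true, supports_parallel_tool_calls: false };
    let req = ChatRequest {
        model: "llama-3.1-70b".into(),
        tool_choice: Some("auto".into()),
        parallel_tool_calls: Some(true),
    };
    let sent = apply_quirks(req, quirks);
    assert_eq!(sent.tool_choice.as_deref(), Some("auto")); // supported: kept
    assert_eq!(sent.parallel_tool_calls, None);            // unsupported: stripped
    println!("stripped unsupported fields");
}
```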

Secrets never leak

API keys are stored as secrecy::SecretString — their Debug output is redacted, and the raw key must be explicitly exposed to be read, so keys never end up in logs.
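The pattern can be sketched without the secrecy crate itself (this `ApiKey` newtype is a stand-in for illustration, not the gateway's actual type):

```rust
use std::fmt;

// Stand-in for secrecy::SecretString: the raw key is reachable only through
// an explicit accessor, and Debug output is redacted so that accidentally
// logging the struct reveals nothing.
struct ApiKey(String);

impl fmt::Debug for ApiKey {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        f.write_str("ApiKey([REDACTED])")
    }
}

impl ApiKey {
    // Deliberately verbose name, so call sites that reveal the key are greppable.
    fn expose_secret(&self) -> &str {
        &self.0
    }
}

fn main() {
    let key = ApiKey("sk-test-123".into());
    // `{:?}` goes through the redacting Debug impl, not the inner String.
    assert_eq!(format!("{:?}", key), "ApiKey([REDACTED])");
    assert_eq!(key.expose_secret(), "sk-test-123");
    println!("debug output: {:?}", key);
}
```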

Quick Start

```sh
cargo build --workspace         # Build
cargo test --workspace          # Test
cargo clippy --workspace        # Lint
```

See CONTRIBUTING.md for the full development guide, code conventions, and how to add new providers.

License

MIT
