feat: add Telnyx as OpenAI-compatible provider #26371

Open
gbattistel wants to merge 11 commits into BerriAI:litellm_internal_staging from team-telnyx:add-telnyx-v2
Conversation

@gbattistel

Summary

Adds Telnyx as an OpenAI-compatible provider in LiteLLM via the providers.json configuration system.

Telnyx provides an OpenAI-compatible Inference API for accessing hosted models (Kimi-K2.6, GLM-5.1, MiniMax-M2.7, and more).

Changes

  • litellm/llms/openai_like/providers.json — Add Telnyx provider entry with base_url: https://api.telnyx.com/v2/ai and api_key_env: TELNYX_API_KEY
  • litellm/types/utils.py — Add TELNYX = "telnyx" to the LlmProviders enum
  • provider_endpoints_support.json — Add Telnyx endpoint support matrix (chat completions + embeddings)
  • tests/test_litellm/llms/openai_like/test_telnyx_provider.py — Unit tests for provider config, resolution, and enum
  • docs/my-website/docs/providers/telnyx.md — Provider documentation
  • docs/my-website/sidebars.js — Add Telnyx to providers sidebar
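
Based on the values listed above, the providers.json addition presumably looks something like the following (a sketch inferred from the stated base_url and api_key_env; the surrounding key names are assumed to follow the existing JSON-provider pattern, not copied from the diff):

```json
{
  "telnyx": {
    "base_url": "https://api.telnyx.com/v2/ai",
    "api_key_env": "TELNYX_API_KEY"
  }
}
```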

Usage

import os
import litellm

os.environ["TELNYX_API_KEY"] = "your-api-key"

response = litellm.completion(
    model="telnyx/moonshotai/Kimi-K2.6",
    messages=[{"role": "user", "content": "Hello!"}],
)
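
Under the hood, the api_key_env field means the key is read from the environment at call time. A minimal illustration of that lookup (illustrative only, not litellm's actual internals):

```python
import os

def resolve_api_key(env_var: str = "TELNYX_API_KEY") -> str:
    """Fetch the provider key the way an api_key_env entry implies.

    Illustrative sketch of env-based key resolution; litellm's real
    resolution logic lives in its provider config layer.
    """
    key = os.environ.get(env_var)
    if not key:
        raise ValueError(f"Set {env_var} before calling litellm.completion()")
    return key
```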

Testing

All 7 unit tests pass:

  • ✅ Provider exists in JSONProviderRegistry
  • ✅ Provider config (base_url, api_key_env)
  • ✅ Dynamic config generation
  • ✅ Provider resolution (telnyx/model-name → provider=telnyx)
  • ✅ Supported params
  • ✅ ProviderConfigManager integration
  • ✅ LlmProviders enum entry
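
The provider-resolution behavior the tests cover can be pictured with a small sketch: the first path segment of the model string selects the provider, and the remainder is forwarded as the remote model name (an illustration of the pattern, not litellm's actual implementation):

```python
def split_provider(model: str) -> tuple[str, str]:
    """Prefix-based resolution sketch: "telnyx/moonshotai/Kimi-K2.6"
    maps to provider "telnyx" and remote model "moonshotai/Kimi-K2.6".
    """
    provider, _, remote_model = model.partition("/")
    return provider, remote_model
```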

References

@veria-ai

veria-ai Bot commented Apr 24, 2026

Low: No security issues found

This PR adds Telnyx as a new OpenAI-compatible provider. The changes are limited to documentation, a JSON configuration entry (base URL and env var name for the API key), an enum addition, a sidebar entry, endpoint support metadata, and unit tests. No new code paths handle user input, authentication, or data processing — the provider is wired through the existing JSON-based provider system.


Status: 0 open
Risk: 1/10

Posted by Veria AI · 2026-04-24T00:21:32.220Z

@greptile-apps

greptile-apps Bot commented Apr 24, 2026

Greptile Summary

This PR registers Telnyx as an OpenAI-compatible provider by adding a providers.json entry, the corresponding LlmProviders.TELNYX enum value, endpoint support metadata, unit tests, and documentation. The implementation follows the established JSON-provider pattern used by other providers in this directory (e.g. veniceai, scaleway). Previously flagged issues (sidebar duplicate, fragile pytest import guard) have both been resolved.

Confidence Score: 5/5

Safe to merge — all changes are additive and follow established patterns with no backwards-incompatible modifications.

All findings are P2 style suggestions. The core implementation (providers.json, enum, endpoint support matrix) is correct and consistent with existing JSON-registered providers. Previously reported issues are addressed. No network calls in tests, no security concerns, no breaking changes.

No files require special attention.

Important Files Changed

  • litellm/llms/openai_like/providers.json — Adds the telnyx entry with the correct base_url and api_key_env; consistent with the existing JSON-provider pattern
  • litellm/types/utils.py — Adds TELNYX = "telnyx" to the LlmProviders enum; clean placement, consistent with other entries
  • provider_endpoints_support.json — Adds the Telnyx endpoint support matrix with chat_completions and embeddings enabled; all other endpoints correctly set to false
  • tests/test_litellm/llms/openai_like/test_telnyx_provider.py — Unit tests cover provider registration, config values, resolution, and the enum; no real network calls; uses a sys.path.insert workspace hack and an if __name__ == "__main__" runner block
  • docs/my-website/docs/providers/telnyx.md — Provider documentation with an accurate base URL, model table, and usage examples; the OpenAI SDK example correctly uses the /openai-suffixed endpoint per Telnyx's quickstart docs
  • docs/my-website/sidebars.js — Adds a single providers/telnyx entry in alphabetical order; the previously reported triple duplicate is resolved
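
The review note about the docs' OpenAI SDK example suggests that raw-SDK usage targets a "/openai"-suffixed variant of the LiteLLM base URL. A small sketch of that relationship (the exact suffix is taken from the review note above and should be treated as an assumption):

```python
# Base URL LiteLLM uses for the telnyx provider (from providers.json).
LITELLM_BASE_URL = "https://api.telnyx.com/v2/ai"

# Per the review note, the OpenAI SDK example points at a /openai-suffixed
# endpoint; the exact path is an assumption based on that note.
OPENAI_SDK_BASE_URL = LITELLM_BASE_URL.rstrip("/") + "/openai"
```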

Sequence Diagram

sequenceDiagram
    participant User
    participant LiteLLM
    participant JSONProviderRegistry
    participant TelnyxAPI

    User->>LiteLLM: completion(model="telnyx/moonshotai/Kimi-K2.6")
    LiteLLM->>JSONProviderRegistry: exists("telnyx")?
    JSONProviderRegistry-->>LiteLLM: true (providers.json entry)
    LiteLLM->>JSONProviderRegistry: get("telnyx")
    JSONProviderRegistry-->>LiteLLM: {base_url, api_key_env}
    LiteLLM->>LiteLLM: resolve api_base = "https://api.telnyx.com/v2/ai"
    LiteLLM->>LiteLLM: resolve api_key from TELNYX_API_KEY env
    LiteLLM->>TelnyxAPI: POST /v2/ai/chat/completions
    TelnyxAPI-->>LiteLLM: OpenAI-compatible response
    LiteLLM-->>User: ModelResponse
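
In concrete terms, the URL implied by the diagram's POST step can be derived by joining the configured base_url with the OpenAI-style chat completions path (an illustrative sketch; litellm's actual URL construction may differ):

```python
# base_url comes from the telnyx entry in providers.json.
base_url = "https://api.telnyx.com/v2/ai"

# Append the OpenAI-compatible chat completions path.
request_url = base_url.rstrip("/") + "/chat/completions"
```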

Reviews (3): last reviewed commit: "fix: remove fragile pytest import guard ..."

@codecov

codecov Bot commented Apr 24, 2026

Codecov Report

✅ All modified and coverable lines are covered by tests.

📢 Thoughts on this report? Let us know!
