Skip to content

Fix reasoning model issue by updating chat API to v2#287

Open
bitun123 wants to merge 2 commits intocohere-ai:mainfrom
bitun123:features/version-modify
Open

Fix reasoning model issue by updating chat API to v2#287
bitun123 wants to merge 2 commits intocohere-ai:mainfrom
bitun123:features/version-modify

Conversation

@bitun123
Copy link
Copy Markdown

@bitun123 bitun123 commented Apr 4, 2026

🛠 Fix: Update Chat API to v2 for Reasoning Model Support

📌 Problem

The v1/chat endpoint does not support reasoning models, causing issues when trying to use them.

✅ Solution

  • Updated endpoint from v1/chatv2/chat
  • Ensures proper compatibility with reasoning models

🔍 Changes Made

  • Modified API endpoint in fetch request
  • No changes to existing logic or UI

🚀 Impact

  • Fixes reasoning model usage
  • Improves API compatibility
  • No breaking changes

🧪 Testing

  • Verified reasoning model works correctly with v2
  • No regressions observed

Note

Medium Risk
Low code complexity but changes the network endpoint used by chatStream, which could alter response shape/behavior or break users if their backend expects v1 semantics.

Overview
Switches the SDK’s streamed chat request (chatStream) from hitting v1/chat to v2/chat to support reasoning models.

Adds a short README note documenting the endpoint version update and the rationale.

Reviewed by Cursor Bugbot for commit 98fc5e4. Bugbot is set up for automated code reviews on this repo. Configure here.

Copy link
Copy Markdown

@cursor cursor bot left a comment

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Cursor Bugbot has reviewed your changes and found 1 potential issue.

Fix All in Cursor

Reviewed by Cursor Bugbot for commit 98fc5e4. Configure here.

(await core.Supplier.get(this._options.environment)) ??
environments.CohereEnvironment.Production,
"v1/chat",
"v2/chat",
Copy link
Copy Markdown

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

V1 request/response format sent to V2 endpoint

High Severity

The endpoint URL was changed from v1/chat to v2/chat, but the request type (ChatStreamRequest), request serializer (serializers.ChatStreamRequest), response parser (serializers.StreamedChatResponse), streaming response type ("streaming" vs "sse"), and event shape ("json" with "\n" terminator vs "sse" with "[DONE]" terminator) all remain v1-format. The v1 request uses a message string field, while v2 expects a messages array with role/content objects — these are completely incompatible schemas. This will cause all chatStream calls to fail at the API level.

Additional Locations (2)
Fix in Cursor Fix in Web

Reviewed by Cursor Bugbot for commit 98fc5e4. Configure here.

Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment

Labels

None yet

Projects

None yet

Development

Successfully merging this pull request may close these issues.

1 participant