Add config option to disable stream usage stats for LLM APIs#3705

Open
utafrali wants to merge 1 commit into higress-group:main from utafrali:fix/issue-3041-llm-api-stream-inlude-usage-stream-reque
Conversation

@utafrali

Ⅰ. Describe what this PR did

Added a config option that controls whether stream requests to OpenAI-compatible LLM APIs automatically have stream_options.include_usage added. When the option disables this, the request body is left unchanged; otherwise, stream_options.include_usage is added as before.

Ⅱ. Does this pull request fix one issue?

Fixes #3041

Ⅲ. Why don't you add test cases (unit test/integration test)?

Added tests covering both disabled and enabled states.

Ⅳ. Describe how to verify it

Send a stream request with the disable flag set and verify that stream_options.include_usage is not added. With the flag off (the default), confirm it is still added.

Ⅴ. Special notes for reviews

Only affects OpenAI-compatible non-generic providers. Default behavior is unchanged.

Ⅵ. AI Coding Tool Usage Checklist (if applicable)

N/A

@CLAassistant

CLA assistant check
Thank you for your submission! We really appreciate it. Like many open source projects, we ask that you sign our Contributor License Agreement before we can accept your contribution.
You have signed the CLA already but the status is still pending? Let us recheck it.
