## Summary

Droid CLI crashes during BYOK streaming for a custom `generic-chat-completion-api` model after the request has already reached first token. In my case, the model is `claude-opus-4-6` routed through AgentRouter, but the failure appears to be in Droid's stream/response handling rather than an early auth/config issue.

The user-visible failure is:

```
TypeError: null is not an object (evaluating 'H.usage')
MetaError: Exec failed
```
## Environment

- Droid CLI: 0.100.0
- OS: macOS (darwin 25.4.0)
- Terminal: Warp
- Mode: `droid exec`
- Model provider: BYOK custom model with `provider: "generic-chat-completion-api"`
## Config shape

I used a temporary settings overlay rather than editing my real `~/.factory/settings.json`:

```json
{
  "customModels": [
    {
      "model": "claude-opus-4-6",
      "displayName": "AgentRouter-Claude-Opus-4-6",
      "baseUrl": "https://agentrouter.org/v1",
      "apiKey": "${AGENT_ROUTER_TOKEN}",
      "provider": "generic-chat-completion-api",
      "maxOutputTokens": 64000
    }
  ]
}
```

Invocation:

```sh
droid exec \
  --settings /path/to/temp-settings.json \
  --model 'custom:AgentRouter-Claude-Opus-4-6-0' \
  --output-format debug \
  'Reply with the single word OK.'
```
## Reproduction

1. Configure a BYOK custom model using `provider: "generic-chat-completion-api"`.
2. Point it at an OpenAI-compatible endpoint that accepts the request and starts streaming back content.
3. Run `droid exec` against that custom model.
4. Observe that Droid gets past request start and records time-to-first-token.
5. Observe that Droid then crashes with the `H.usage` TypeError.
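For context on step 2: an OpenAI-compatible endpoint only attaches a `usage` object to the final streamed chunk when the client requests `stream_options: {"include_usage": true}`; otherwise no chunk carries it at all. The sketch below (a hypothetical helper, not Droid code) parses a stream shaped like the failing case and shows why any consumer that tracks "the object holding usage" can be left with `null`:

```javascript
// Parse an OpenAI-style SSE body into chunk objects.
// Streams end with "data: [DONE]"; other data lines carry JSON chunks.
function parseSseChunks(body) {
  return body
    .split("\n")
    .filter((line) => line.startsWith("data: "))
    .map((line) => line.slice("data: ".length))
    .filter((payload) => payload !== "[DONE]")
    .map((payload) => JSON.parse(payload));
}

// A stream shaped like the failing case: content arrives and the stream
// finishes cleanly, but no chunk ever carries a `usage` object.
const body = [
  'data: {"choices":[{"index":0,"delta":{"content":"OK"}}]}',
  'data: {"choices":[{"index":0,"delta":{},"finish_reason":"stop"}]}',
  "data: [DONE]",
  "",
].join("\n");

const chunks = parseSseChunks(body);

// A consumer that remembers "the chunk that carried usage" stays at null,
// so a later unguarded `H.usage` access throws exactly the reported TypeError.
let usageChunk = null;
for (const chunk of chunks) {
  if ("usage" in chunk) usageChunk = chunk;
}
console.log(usageChunk); // null
```

This is only my reading of the endpoint shape; whether AgentRouter omits `usage` entirely or sends `"usage": null` on some chunk, the consumer ends up dereferencing a null object either way.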
## Expected behavior

If the upstream endpoint is valid enough for Droid to start streaming tokens, Droid should either:

- complete successfully, or
- surface a normal provider error.

It should not crash internally with:

```
TypeError: null is not an object (evaluating 'H.usage')
```
## Actual behavior

`droid exec` exits with:

```
{"type":"error","source":"agent_loop","message":"null is not an object (evaluating 'H.usage')"}
{"type":"error","source":"cli","message":"Exec failed"}
```
## Evidence that the request gets far enough to stream

From `~/.factory/logs/droid-log-single.log`:

```
[2026-04-20T22:18:49.079Z] INFO: [LLM] sendMessage | Context: {"messageThreadLength":2,"toolCount":9,"modelId":"claude-opus-4-6",...}
[2026-04-20T22:18:52.593Z] INFO: [metrics_log_chat_client_time_to_first_token] | Context: {"modelId":"claude-opus-4-6","modelProvider":"generic-chat-completion-api","apiProvider":"fireworks","value":3.513,...}
[2026-04-20T22:18:52.677Z] WARN: [useLLMStreaming] LLM error | Context: {"error":{"name":"TypeError","message":"null is not an object (evaluating 'H.usage')","stack":"TypeError: null is not an object (evaluating 'H.usage')'..."},"attempt":1,"modelId":"claude-opus-4-6",...}
[2026-04-20T22:20:31.219Z] ERROR: [Agent] runAgent error
TypeError: null is not an object (evaluating 'H.usage')
    at X$L (src/models/chunkProcessing.ts:903:7)
    at <anonymous> (src/hooks/createLLMStreamingCore.ts:1511:9)
```
And the debug output from `droid exec --output-format debug` ends with:

```
{"type":"system","subtype":"init",...,"model":"custom:AgentRouter-Claude-Opus-4-6-0","reasoning_effort":"none"}
{"type":"message","role":"user",...,"text":"Reply with the single word OK."}
{"type":"error","source":"agent_loop","message":"null is not an object (evaluating 'H.usage')"}
{"type":"error","source":"cli","message":"Exec failed"}
```
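The stack trace points at chunk post-processing, so one plausible shape of a fix (hypothetical names only; I have not read the Droid source beyond the minified trace) is a null guard before reading `usage` on the streamed/final object:

```javascript
// Hypothetical sketch of null-safe usage extraction in chunk processing.
// `finalChunk` may be null if the stream never produced a usage-bearing chunk.
function extractUsage(finalChunk) {
  // Optional chaining avoids "null is not an object (evaluating '....usage')".
  const usage = finalChunk?.usage ?? null;
  return {
    inputTokens: usage?.prompt_tokens ?? 0,
    outputTokens: usage?.completion_tokens ?? 0,
  };
}

console.log(extractUsage(null)); // { inputTokens: 0, outputTokens: 0 }
console.log(extractUsage({ usage: { prompt_tokens: 12, completion_tokens: 1 } })); // { inputTokens: 12, outputTokens: 1 }
```

Defaulting to zero usage (or surfacing a provider warning) would match the expected behavior above better than an internal TypeError.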
## Notes

- This looks similar in shape to the v0.99.0 streaming parser regression `undefined is not an object (evaluating 'h.type')` (#937, streaming chunk parser crashes on tool-call output index gaps): both appear to be null-safety problems in chunk processing.
- The error suggests the code assumes `usage` exists on some streamed/final object where it may actually be `null` or absent.
- If useful, I can provide a reduced reproduction against the same endpoint shape without the vendor-specific token.

Co-Authored-By: ForgeCode <noreply@forgecode.dev>