Add LLM config option for indexing stage and query stage #2889
tweinberger-lei started this conversation in Ideas
Replies: 1 comment
- Similar to #2848. Looking forward to v1.5 :-)
The documentation recommends using only an instruct (non-reasoning) LLM for the indexing stage, while a reasoning LLM could improve results at the query stage. It would therefore be great to support configuring separate LLMs for the indexing and query stages in the config file, for example:
INDEXING_LLM_BINDING=openai
INDEXING_LLM_BINDING_HOST=https://api.openai.com/v1
INDEXING_LLM_BINDING_API_KEY=your_api_key
INDEXING_LLM_MODEL=gpt-5.4
QUERY_LLM_BINDING=openai
QUERY_LLM_BINDING_HOST=https://api.openai.com/v1
QUERY_LLM_BINDING_API_KEY=your_api_key
QUERY_LLM_MODEL=gpt-5-mini
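One way such a feature might be implemented is a small loader that reads stage-prefixed variables and falls back to shared `LLM_*` settings when a stage-specific one is absent. This is only a sketch of the idea, not LightRAG's actual API; the `LLMConfig` class, `load_llm_config` function, and the unprefixed `LLM_*` fallback variables are all hypothetical.

```python
import os
from dataclasses import dataclass

@dataclass
class LLMConfig:
    binding: str
    host: str
    api_key: str
    model: str

def load_llm_config(stage: str) -> LLMConfig:
    """Build an LLM config for a stage ("indexing" or "query").

    Stage-specific variables (e.g. INDEXING_LLM_MODEL) take precedence;
    shared LLM_* variables act as a fallback. Names are illustrative.
    """
    prefix = stage.upper()

    def get(suffix: str, default: str = "") -> str:
        # Prefer INDEXING_LLM_<suffix> / QUERY_LLM_<suffix>, else LLM_<suffix>
        return os.environ.get(
            f"{prefix}_LLM_{suffix}",
            os.environ.get(f"LLM_{suffix}", default),
        )

    return LLMConfig(
        binding=get("BINDING", "openai"),
        host=get("BINDING_HOST"),
        api_key=get("BINDING_API_KEY"),
        model=get("MODEL"),
    )

# Example: a stage-specific model overrides the shared default
os.environ["LLM_MODEL"] = "gpt-5-mini"
os.environ["INDEXING_LLM_MODEL"] = "gpt-5.4"
print(load_llm_config("indexing").model)  # gpt-5.4
print(load_llm_config("query").model)     # gpt-5-mini
```

With this fallback scheme, existing single-LLM deployments would keep working unchanged, since the stage-prefixed variables are optional.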
Thanks!