Is your feature request related to a problem? Please describe.
Using local LLMs instead of the OpenAI API as the backend.
Describe the solution you'd like
Create a DemoGPT agent from a locally available model (ideally, a quantized Llama 2 model via llama-cpp-python).
Describe alternatives you've considered
If that's already possible, a guide or some instructions on how to do it would be greatly appreciated!
Additional context
NA