Use this guide if you want CapYap agents to run on local Ollama models for free.
- Download installer: https://ollama.com/download
Run in Terminal:
ollama serve
ollama pull llama3.1
You can pull other models too (for example qwen2.5:7b).
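Before starting CapYap, it can help to confirm the Ollama server is actually reachable. A minimal sketch, assuming Ollama's default API port 11434 and its `/api/tags` endpoint (which lists pulled models):

```shell
# check_ollama: prints a status line; returns 0 if the Ollama API
# on its default port (11434) answers, 1 otherwise.
check_ollama() {
  if curl -sf http://localhost:11434/api/tags >/dev/null 2>&1; then
    echo "Ollama API reachable on localhost:11434"
  else
    echo "Ollama API not reachable on localhost:11434 -- is 'ollama serve' running?"
    return 1
  fi
}

check_ollama || true
```

If the check fails, re-run `ollama serve` in a separate terminal and try again.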
From the repo root:
conda activate capyap
capyap start
CapYap runs locally and the backend serves the app on localhost.
Desktop app note:
- Packaged desktop builds try to auto-start the backend.
- If desktop cannot connect, run this command manually and keep it open:
capyap start --no-browser
- Load a transcript source.
- Click Start AI Analysis.
- Select ollama in the provider picker.
- Wait for the Ollama status check.
- Click Start Analyzing when status is ready.
Notes:
- No external cloud API key is required for Ollama.
- If Ollama is not detected, use the modal's Install Ollama link and Re-check button.
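If the Re-check button keeps failing, you can verify the install from the terminal yourself. This is only a sketch of a plausible manual check (the modal's exact detection logic is an assumption), using the real `ollama` CLI:

```shell
# check_ollama_cli: reports whether the ollama binary is on PATH,
# and if so, lists the models that have been pulled.
check_ollama_cli() {
  if command -v ollama >/dev/null 2>&1; then
    echo "ollama CLI found; pulled models:"
    ollama list 2>/dev/null || true   # may fail if the server is not running
  else
    echo "ollama CLI not found in PATH -- install it from https://ollama.com/download"
  fi
}

check_ollama_cli
```

If the CLI is found but `ollama list` shows nothing, pull a model (for example `ollama pull llama3.1`) before re-checking in CapYap.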