Replies: 2 comments
Hi @yottoya, thanks for asking! I'm wondering how large your codebase is? The real way to make it faster is to avoid starting a new process after each edit. E.g., you may consider running cocoindex within the same process as your MCP server, and calling the update API after each edit (or before making queries). It usually finishes within a second unless you have >100K source files. Besides, I just created a feature request to natively support change notifications.
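The pattern suggested above (stay in one process, pay the update cost only when something changed) can be sketched with a small dirty-flag wrapper. This is a generic sketch, not cocoindex's actual API; `update_fn` is a hypothetical callback standing in for whatever call refreshes the index:

```python
import threading


class InProcessIndexer:
    """Keeps index updates in-process instead of spawning a subprocess per edit.

    `update_fn` is a hypothetical callback standing in for the real
    index-refresh call (e.g. a flow's update API).
    """

    def __init__(self, update_fn):
        self._update_fn = update_fn
        self._dirty = False
        self._lock = threading.Lock()

    def mark_edited(self):
        # Called by the edit tool after each file edit; cheap, returns at once.
        with self._lock:
            self._dirty = True

    def refresh_if_needed(self):
        # Called right before serving a query: run the update only when an
        # edit actually happened since the last refresh.
        with self._lock:
            if not self._dirty:
                return False
            self._dirty = False
        self._update_fn()
        return True
```

With this shape the LLM's edits return immediately (`mark_edited` just flips a flag), and the expensive refresh runs at most once per batch of edits, right before the next query.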
@georgeh0 thoughts?
Hello! I wanted to ask about a feature I might have overlooked in the docs, so I thought I'd confirm here before trying a different solution.
I have created a flow for my codebase, and it works great with the MCP server I built to read the latest context of my projects via Opencode.
However, the issue I'm running into is when editing.
For editing, I created a tool that runs an update at the end of each edit:
This way, after each edit the LLM makes, we force an update to the db to ensure we always have the latest context.
However, this is very slow: after each edit it has to run this subprocess and wait for it to finish before moving on to the next edit task.
Because the subprocess takes so long, I'm wondering whether there is a better way to get live updates only when specific files have been modified or added.
That way the LLM could make edits instantaneously, we wouldn't have to run this subprocess, and after the edits are applied cocoindex would recognize the changes and re-index immediately. Is there such a feature?
Or must I implement my own (I was thinking via the watchdog package)?
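For reference, if you do end up rolling your own watcher, here is a minimal polling sketch using only the standard library (no watchdog dependency): it snapshots file mtimes and fires a callback only when tracked files change. `on_change` is a hypothetical callback standing in for the re-index step, and the extension filter is just an example:

```python
import os
import time


def snapshot(root, exts=(".py", ".md")):
    """Map each tracked file under `root` to its last-modified time."""
    mtimes = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if name.endswith(exts):
                path = os.path.join(dirpath, name)
                try:
                    mtimes[path] = os.stat(path).st_mtime
                except OSError:
                    pass  # file disappeared between walk and stat
    return mtimes


def changed_paths(before, after):
    """Paths added, removed, or modified between two snapshots."""
    return sorted(p for p in set(before) | set(after)
                  if before.get(p) != after.get(p))


def watch(root, on_change, interval=1.0):
    """Poll `root` and call `on_change(paths)` only when files changed."""
    before = snapshot(root)
    while True:
        time.sleep(interval)
        after = snapshot(root)
        changed = changed_paths(before, after)
        if changed:
            on_change(changed)  # e.g. trigger a targeted re-index here
        before = after
```

Polling is cruder than watchdog's native OS notifications, but it is dependency-free and easy to reason about; either way the key point is that the edit path never blocks on re-indexing.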