Fix neighbor cache not being cleared when search radius increases #464
atharrva01 wants to merge 2 commits into BioDynaMo:master
Conversation
Signed-off-by: atharrva01 <atharvaborade568@gmail.com>
Hi @johnpapad24, this fixes a bug where the neighbor cache wasn't cleared before being rebuilt with a larger search radius, leaving duplicate entries. This led to neighbors being processed multiple times and produced silently incorrect simulation results.
Hi @atharrva01, thanks for the contribution! I'll review the changes and get back to you. In the meantime: the CI issue that previously prevented … Could you please update your PR branch with the latest
Thanks for the update @sportokalidis! I've merged the latest … Let me know if you notice anything else that needs adjustment.
Summary
I found a bug in the neighbor caching code inside `ForEachNeighbor`. When `cache_neighbors` is turned on, the cache gets reused across multiple calls. That's fine until someone requests a bigger search radius than before: the cache becomes stale and needs to be rebuilt. The problem is that the code wasn't clearing out the old cache entries before adding the new ones, so you'd end up with the old neighbors plus the new ones, which means duplicates.
Impact
If the cache has duplicate entries, the same neighbor can get processed multiple times in later queries, skewing any quantity that is accumulated over neighbors. The tricky part is that nothing actually crashes; the simulation just quietly produces wrong results.
Fix
Just clear the cache before refilling it:
```cpp
cached_squared_search_radius_ = squared_radius;
neighbor_cache_.clear();  // wipe old data
```

Now the cache always has the right neighbors when it gets rebuilt.