Is your feature request related to a problem? Please describe.
Describe the bug
When building a site with a large number of dynamic route pages (generated via .paths.ts),
the build process crashes with a JavaScript heap out of memory error during the "rendering pages" phase.
This happens even with NODE_OPTIONS=--max-old-space-size=8192 (8 GB heap).
Reproduction
- VitePress version: 2.0.0-alpha.16
- Node.js version: v22.21.1
- OS: Windows 11
Route setup:
- 20 locale directories, each with a [source]-[target].paths.ts
- Each paths file generates ~1,300 route pairs
- Total pages: ~26,000
```ts
// [source]-[target].paths.ts (simplified)
export default {
  paths: () =>
    generatePairs(list).map(([source, target]) => ({
      params: { source, target, ...meta },
    })),
}
```
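For context on the page count: the pair generator enumerates every ordered (source, target) combination, so the route count grows quadratically with the list length. A minimal sketch of such a helper (my own illustration, not the actual implementation):

```ts
// Sketch of a pair generator: every ordered (source, target)
// combination with source !== target. With n entries this yields
// n * (n - 1) pairs, which is how a modest list balloons to ~1,300 routes.
function generatePairs(list: string[]): Array<[string, string]> {
  const pairs: Array<[string, string]> = []
  for (const source of list) {
    for (const target of list) {
      if (source !== target) pairs.push([source, target])
    }
  }
  return pairs
}
```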
Error output:
```
✓ building client + server bundles...
⠧ rendering pages...

<--- Last few GCs --->
[...] Mark-Compact 8012.2 (8226.2) -> 7997.8 MB ...

FATAL ERROR: Ineffective mark-compacts near heap limit
Allocation failed - JavaScript heap out of memory
```
Expected behavior
Either:
- VitePress streams/batches page rendering internally to keep memory bounded, or
- Provides an official API / config option (e.g. build.pagesPerChunk) to limit how many pages are held in memory at once during SSR rendering.
Current workaround
I implemented a manual chunked-build script that sets BATCH_INDEX / BATCH_TOTAL env vars, filters the .paths() output per batch, and merges the output dirs afterward. This keeps the build alive, but it is fragile and requires maintaining a custom script.
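The per-batch filtering inside each paths file looks roughly like this (a sketch; BATCH_INDEX, BATCH_TOTAL, and the round-robin slicing are my own conventions, not a VitePress API):

```ts
// Sketch: per-batch filtering for a dynamic-route paths file.
// BATCH_INDEX / BATCH_TOTAL are set by the external chunked-build script.
export function filterBatch<T>(items: T[]): T[] {
  const total = Number(process.env.BATCH_TOTAL ?? 1)
  const index = Number(process.env.BATCH_INDEX ?? 0)
  // Round-robin slice: item i belongs to batch (i mod total).
  return items.filter((_, i) => i % total === index)
}

// Example pairs a [source]-[target].paths.ts might generate.
const allPairs: Array<[string, string]> = [
  ['en', 'fr'],
  ['en', 'de'],
  ['fr', 'de'],
]

export default {
  paths: () =>
    filterBatch(allPairs).map(([source, target]) => ({
      params: { source, target },
    })),
}
```

The wrapper script then runs the build once per batch index and merges the resulting output directories.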
Questions
- Is there a planned solution for large-scale static generation in VitePress?
- Would it be feasible to add a streaming/incremental rendering mode (similar to how Nuxt/Next handle ISR)?
- Is there a recommended way to handle sites with 10,000+ dynamic pages today?
Describe the solution you'd like
Add a build.maxPagesPerChunk (or similar) config option that makes VitePress render pages in batches internally, instead of holding all SSR-rendered pages in memory at once.
Ideal behavior:
- Pages are rendered in streaming/incremental batches (e.g., 500 at a time)
- Memory usage stays bounded regardless of total page count
- No change required to user config or .paths.ts files
- Works transparently with dynamic routes, multi-locale sites, and sitemaps
Example config (proposed API):
```ts
// .vitepress/config.ts
export default defineConfig({
  build: {
    maxPagesPerChunk: 1000, // render at most N pages per pass
  },
})
```
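Conceptually, the internal change would split the render loop into fixed-size slices so that only one slice's SSR output is alive at a time. A minimal sketch of the idea (my own illustration, not VitePress internals):

```ts
// Sketch of bounded-memory batched rendering (hypothetical, not VitePress code).
async function renderInBatches(
  pages: string[],
  batchSize: number,
  renderPage: (page: string) => Promise<void>,
): Promise<void> {
  for (let start = 0; start < pages.length; start += batchSize) {
    const batch = pages.slice(start, start + batchSize)
    // Render one slice and write its results to disk; the slice's
    // rendered output becomes garbage-collectable before the next pass.
    await Promise.all(batch.map(renderPage))
  }
}
```

Peak memory then scales with batchSize rather than with the total page count.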
Describe alternatives you've considered
No response
Additional context
No response
Validations