Hi! I'm thinking of giving litData a try. It seems that many usage examples are designed for working with small files. Do you have a recommendation for working with medical images on the order of hundreds of MB? I'd probably want to download one (large) image volume and one or more (small) segmentation volumes, then extract multiple subvolumes locally, each of which would be a training instance.
Hi @fepegar — while litData typically recommends ~64 MB chunks for optimal streaming, larger chunks (500 MB–1 GB) work fine too, especially if download speed isn't a bottleneck.
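For the subvolume idea, one common pattern is to read each large volume once and emit several matching image/segmentation patches as separate training samples. Below is a minimal sketch of that extraction step using NumPy; `extract_patches` and all the names in it are hypothetical, and the synthetic arrays stand in for real volumes (which you would load with e.g. nibabel or SimpleITK). A function like this could also serve as the per-item function you pass to litData's `optimize()` (with a larger `chunk_bytes` as discussed above), so each patch pair becomes one streamable sample.

```python
import numpy as np

def extract_patches(image, seg, patch_size=(64, 64, 64), n_patches=4, rng=None):
    """Extract matching random subvolumes from an image and its segmentation.

    Hypothetical helper: each (image_patch, seg_patch) pair is one training
    instance, so the large volume only needs to be read once.
    """
    rng = rng or np.random.default_rng()
    patches = []
    for _ in range(n_patches):
        # Pick a random corner so the patch fits entirely inside the volume.
        corner = [rng.integers(0, s - p + 1) for s, p in zip(image.shape, patch_size)]
        slices = tuple(slice(c, c + p) for c, p in zip(corner, patch_size))
        # Slice image and segmentation with the same indices to keep them aligned.
        patches.append((image[slices], seg[slices]))
    return patches

# Synthetic stand-ins for a real image volume and its segmentation:
image = np.zeros((128, 128, 128), dtype=np.float32)
seg = np.zeros((128, 128, 128), dtype=np.uint8)
pairs = extract_patches(image, seg, patch_size=(64, 64, 64), n_patches=4)
```

Emitting patches at preparation time (rather than streaming whole volumes and cropping in the dataloader) keeps each sample small, which is what litData's chunked streaming is designed around.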