
fix: do not load stream into memory when decompressing/compressing#179

Open
elmaxe wants to merge 1 commit into master from compression-memory-fix

Conversation

@elmaxe
Member

@elmaxe elmaxe commented Mar 3, 2026

  • Do not load the whole stream into memory when downloading and decompressing attachments; stream it through the Brotli stream instead.
  • Do not compress the stream into a memory stream before uploading. Instead, create a Pipe with a task that writes compressed data to the pipe's write end while the Azure SDK concurrently reads from the read end.
  • Use CreateIfNotExistsAsync in UploadBlobAsync instead of the previous old gymnastics.
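A minimal sketch of the CreateIfNotExistsAsync change, assuming Azure.Storage.Blobs; the names blobServiceClient, containerName, blobName, and content are illustrative, not taken from the PR:

  // Ensure the container exists in one idempotent call, then upload.
  var container = blobServiceClient.GetBlobContainerClient(containerName);
  await container.CreateIfNotExistsAsync();   // replaces the manual exists-check gymnastics
  var blob = container.GetBlobClient(blobName);
  await blob.UploadAsync(content, overwrite: true);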

Claude:

  Download path (DecompressStreamAsync):

  // Line ~222 in BlobStorageMessageAttachmentProvider.cs
  var memoryStream = new MemoryStream();         // allocates in memory
  var brotliStream = new BrotliStream(source, CompressionMode.Decompress);
  await brotliStream.CopyToAsync(memoryStream);  // decompresses EVERYTHING into RAM
  memoryStream.Position = 0;
  return memoryStream;

  The upload path (CompressStreamAsync) has the same pattern: the full compressed result goes into a MemoryStream before uploading.

  The initial blob read uses blob.OpenReadAsync() which is properly streamed, but that benefit is immediately lost
  once decompression kicks in.

  The fix is straightforward: return a BrotliStream wrapping the blob stream directly instead of copying into a MemoryStream:

  // Instead of copying into MemoryStream, just return the wrapping stream:
  var blobStream = await blob.OpenReadAsync(...);
  return new BrotliStream(blobStream, CompressionMode.Decompress); // consumer reads lazily

  This would make decompression fully streaming: the consumer reads decompressed chunks on demand without buffering the whole file. The upload path can similarly pipe directly from the source stream through a BrotliStream into blob.UploadAsync(), since the Azure SDK accepts a Stream and handles chunked uploading itself.
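A sketch of the Pipe-based upload described above, assuming System.IO.Pipelines and a BlobClient from Azure.Storage.Blobs; the method name UploadCompressedAsync is illustrative, not from the PR:

  static async Task UploadCompressedAsync(BlobClient blob, Stream source)
  {
      var pipe = new Pipe();

      // Writer task: compress the source into the pipe's write end.
      var writerTask = Task.Run(async () =>
      {
          try
          {
              // Disposing the BrotliStream flushes the final compressed block.
              await using var brotli = new BrotliStream(
                  pipe.Writer.AsStream(), CompressionMode.Compress);
              await source.CopyToAsync(brotli);
          }
          finally
          {
              await pipe.Writer.CompleteAsync();  // signal end-of-stream to the reader
          }
      });

      // Reader side: the Azure SDK consumes compressed chunks as they arrive,
      // so the full compressed payload is never buffered in memory.
      await blob.UploadAsync(pipe.Reader.AsStream(), overwrite: true);
      await writerTask;
  }

The key design point is that compression and upload run concurrently: the pipe's bounded buffer applies backpressure, so memory use stays flat regardless of attachment size.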

@elmaxe elmaxe requested review from NiklasArbin and mbaneryd March 3, 2026 08:32
@sonarqubecloud

sonarqubecloud bot commented Mar 3, 2026
