Currently, content is only uploaded once a request comes in, which adds delay. For use cases where it's certain that a request will come in, an option to upload the content ahead of time and pre-populate the server's cache could reduce latency. The effect would be noticeable because upload speeds are usually far worse than download speeds, so every second spent uploading before a request arrives significantly reduces the time it takes for the request to complete.
There are many ways to do this:
- Require the cache in use to expose an API for pre-population (few caches likely support this)
- Simply send an HTTP request from the server itself (to ensure the lowest latency) to trigger the upload immediately. To reduce CPU load, the response data should be discarded right away: the request should only trigger the upload by the client and the storing of the data by the cache, without producing a full response. Can an HTTP HEAD request do this? Or can we send an HTTP GET and abort it immediately? Sending an HTTP request still introduces some delay, though, because the server needs to forward the request through the websocket to the client, and only then does the client respond. It would be fastest if the client could start uploading data immediately
- Create a custom cache implementation which supports server-side pre-population. That's a lot of work, but a single cache implementation that satisfies all cache requirements may be better than making users find a suitable cache themselves
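The second option above can be sketched as a small client-side "pre-warm" helper: the server (or any caller) requests the content's own URL and aborts after reading only the response headers, so the upload and cache population are triggered without downloading the body. This is a minimal sketch only; the function name `prewarm` and the path are illustrative assumptions, not the project's real API.

```python
import http.client


def prewarm(host: str, port: int, path: str) -> int:
    """Issue a GET for `path` and abort after the headers arrive.

    Reading only the status line and headers is enough to make the
    upstream start uploading (and the cache start storing) the content;
    closing the connection immediately discards the body so almost no
    download bandwidth or CPU is spent.
    """
    conn = http.client.HTTPConnection(host, port, timeout=10)
    try:
        conn.request("GET", path)
        resp = conn.getresponse()  # headers are in; body not read yet
        return resp.status
    finally:
        conn.close()  # aborts the transfer before the body is consumed
```

Whether a HEAD request would have the same effect depends on the cache and the origin: many caches will not populate an entry from a HEAD response, which is why this sketch uses an aborted GET instead.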
This issue is more of a discussion and marked as low priority because:
- Small files will be uploaded quickly anyway; the difference is effectively only hundreds of milliseconds
- Large files won't benefit much from starting the upload a few seconds earlier, especially with slow upload speeds
- This is basically only relevant in two cases:
    - For larger content where the request arrives later than a few seconds after pre-population starts but earlier than T+C, where T is the point in time at which the data was fully uploaded and committed to the cache and C is the cache duration
    - For smaller to medium-sized content that needs to be transmitted as fast as possible, e.g. in contexts where fast transfers are crucial but peer-to-peer communication only has limited bandwidth (e.g. for the generated URL)
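The benefit window above can be made concrete with a rough model (all values here are illustrative assumptions, not measurements): if pre-population starts at t = 0, a request arriving at time t saves up to `min(t, T)` seconds of upload time, and saves nothing once the cache entry has expired at T+C.

```python
def time_saved(request_at: float, upload_time: float, cache_duration: float) -> float:
    """Seconds of upload the requester no longer waits for,
    assuming pre-population started at t = 0.

    request_at     -- when the request arrives, seconds after t = 0
    upload_time    -- T: seconds until the data is fully uploaded and committed
    cache_duration -- C: seconds the cache keeps the entry after commit
    """
    if request_at > upload_time + cache_duration:
        return 0.0  # entry expired at T+C; the upload starts over
    # Before T, only the head start so far counts; after T, the full upload.
    return min(request_at, upload_time)
```

For example, with a 30 s upload and a request arriving 10 s in, 10 s are saved; a request arriving after the upload finished (but before expiry) saves the full 30 s.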