Very large variation in download speeds

Hi - I’m seeing very large variations in download speeds for the following URL:

https://cuelang.org/play/main.wasm

My test involves looping the following command:

wget -S --header="accept-encoding: gzip" https://cuelang.org/play/main.wasm
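
Spelled out, the loop looks roughly like this (a sketch: the iteration count, the coarse timing, and the grep on the etag and x-nf-request-id headers are incidental details, not exact):

#!/usr/bin/env bash
# Sketch of the test loop: repeatedly download the file, record how
# long each transfer takes, and note the headers of interest.
URL="https://cuelang.org/play/main.wasm"

for i in $(seq 1 20); do
  start=$(date +%s)
  wget -S --header="accept-encoding: gzip" -O /dev/null "$URL" 2>&1 |
    grep -iE 'etag|x-nf-request-id'
  echo "run $i: $(( $(date +%s) - start ))s"
done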

My internet connection is stable.

For example, request 55e310ef-a53d-4ca1-bdec-70b14b0ef731-54565686 took ~2 seconds, whereas 55e310ef-a53d-4ca1-bdec-70b14b0ef731-54485074 took ~30 seconds.

Please can someone help look into this?

Many thanks

Are both those IDs from the same location?

Yes, they are. My local machine.

Well, then it’s strange. The only explanation I know of, though perhaps not the correct one, would be that Netlify is not optimised to serve files over 3 MB in size. Such files are not as easily cached on the CDN nodes and take time to transfer.

30 seconds might be a realistic time for such a file; it’s the 2-second response that I find strange.

I think the official Netlify support team would have a better answer, as they can check those IDs.


Hi, @myitcv. The 2 second request was after the 32 second request. The first request was slow because the file was not in the CDN cache. The second request was faster because the file was being served from memory by the CDN node cache.

That said, we would normally expect a much faster download even when the file is not yet cached. A 32-second timing for this file, which is approximately 5 MB when compressed, is unusual even for an uncached request.
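
If it helps to check this from your end, the response headers usually tell a hit from a miss. Here’s a sketch using curl; it assumes the standard Age header (the seconds an object has spent in a cache) is exposed, alongside the etag and the x-nf-request-id values you’ve been quoting:

# Sketch: one GET, dumping cache-relevant headers and the total time.
# A non-zero Age suggests the node served the file from cache; a
# missing or zero Age suggests it had to fetch from origin first.
curl -s --compressed -o /dev/null -D - \
     -w 'time_total: %{time_total}s\n' \
     https://cuelang.org/play/main.wasm |
  grep -iE '^(age|etag|x-nf-request-id):|^time_total'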

Hi @luke - thanks for the response.

The 2 second request was after the 32 second request. The first request was slow because the file was not in the CDN cache. The second request was faster because the file was being served from memory by the CDN node cache.

That would make sense, except that I do see very slow responses after very fast ones, despite the etag not having changed (i.e. the file contents are identical).

e.g. take the following requests from just now (all resulted in etag 0144f3faa7786907560fdbddda3195a5-ssl-df):

75534212-eaeb-40cb-8c7b-34f5f786eed9-43380143
b2c6d382-98a3-4f32-a3c3-69bbf567e207-145086539
29d79855-30c0-49e7-a989-497abd43c2e1-4348112
d5100e51-32bc-4a7e-9584-1ac105f41c6a-6098856
99c288e4-0b3c-4290-b01b-5a115a7612de-53499874
a606e1f4-1748-43f9-8f7a-2673bb32e943-73248945

All took ~32 seconds, which is somewhat suspicious in and of itself because the timings are so similar.

Is there some sort of rate limiting going on here?

Thanks

Hey @myitcv,
No rate limiting. While the content did not change, the requests you shared were all served from different CDN nodes, and the asset had not yet been cached on any of them. Here’s the list of nodes in case it’s of interest:

CDN node           Requests
cdn-reg-do-fra-10  1
cdn-reg-do-fra-9   1
cdn-reg-do-fra-6   1
cdn-reg-do-fra-3   1
cdn-reg-aws-fra-3  1
cdn-reg-do-fra-8   1

One benefit of increased traffic to your site is that each request served from a different CDN node leaves the asset cached there, so it arrives faster for the next person served from that same node, since it doesn’t have to travel all the way from our origin on the US west coast first.

Again, I know this isn’t ideal for your use case, but I did want to share the results of our digging.

Thanks for the update, @jen. Is there a way in which these CDN nodes can be primed with content, to avoid end users taking the hit?

Nothing systematic from our end. However, after a deploy, you could request the site from several locations a number of times to cache the newly deployed content.
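
As a rough sketch of that idea (the asset list, the repeat count, and running it from more than one region are all your choices, not anything built in):

#!/usr/bin/env bash
# Hypothetical post-deploy warm-up: fetch the heavy assets a few times
# so whichever CDN nodes answer end up holding them in cache. Running
# this from one location only primes the nodes that location hits, so
# repeat it from machines in each region you care about.
ASSETS=(
  "https://cuelang.org/play/main.wasm"
)

for url in "${ASSETS[@]}"; do
  for i in $(seq 1 10); do
    curl -s --compressed -o /dev/null "$url"
  done
done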

There’s no foolproof way to tackle this, I’m afraid, given that nodes are added to and removed from rotation quite often!


While what Scott said is true - we don’t have a “re-prime cache after deploy” feature - it generally shouldn’t matter much for well-designed sites that avoid deploy antipatterns such as massive filename changes: most files in your site probably don’t change in every deploy. To mitigate the effects you’re experiencing, please check out this description of how to make the most of our CDN cache through careful deploy patterns:
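
As a quick illustration of the filename point (my own example, not from the guide): if an asset’s URL and contents survive a deploy, its etag survives too, and nodes that already hold the file can keep serving it, whereas renaming assets on every deploy (for example, hash-stamped filenames) forces every node to start cold again.

# Sketch: check the asset's etag before and after a deploy; an
# unchanged etag at a stable URL means cached copies remain usable.
curl -sI https://cuelang.org/play/main.wasm | grep -i '^etag'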
