Slow download speed for 23 MB static asset (fonts)

Here is another example that you can try to correlate with your logs.

The request id for convenience: 8ef36176-3e42-473f-96b9-5d8044a39f4a-4441296

It's hard for me to understand a world where serving a 186 kB file in 5 seconds is somehow acceptable performance.

Since our JS files of similar size are served at a more acceptable speed, it would almost seem like you are not caching files of certain types (or perhaps only caching files of certain types), somehow assuming that in our case .json files should not be cached.

Welcome to the thread, @msanchez!

I think both of you are looking at the compressed size of your files. Our CDN internally transfers, and stores in cache, the uncompressed files; that is the size that matters when checking performance from our internal logs. I understand that your browser downloaded far fewer bytes, but when transferring the file, the actual size on disk matters. That is the number one factor in how fast we can serve uncached content: how big the uncompressed asset is. What started this thread, a 23 MB font file, will never work well on our CDN, as we have not designed it with that use case in mind.
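The gap between on-the-wire size and size at rest can be illustrated with Python's standard-library `gzip` module. This is only a sketch with made-up JSON-like data; how Netlify's CDN actually stores assets is internal to them:

```python
import gzip

# Highly repetitive JSON-like payload: it compresses extremely well,
# so the transfer size a browser reports badly understates the
# uncompressed size a CDN would move between nodes.
uncompressed = b'{"glyph": "AAAA", "advance": 512},' * 40000

compressed = gzip.compress(uncompressed, compresslevel=9)

print(f"uncompressed: {len(uncompressed) / 1024:.0f} kB")
print(f"compressed:   {len(compressed) / 1024:.0f} kB")
# A file the browser sees as a few hundred kB can be many times
# larger at rest, which is the size that drives transfer time.
```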

That said, I also do not think that an average response time of 5 s for interactive content is acceptable. However, our CDN caches opportunistically, so if an asset is frequently used on a CDN node, it will be served in milliseconds, not seconds. If it is not in cache, it may take some time to transfer. We aim for that time to be as small as possible, but smaller assets transfer faster.

It is an interesting assertion that JSON files are cached differently. While I would be surprised to hear it, I have nonetheless posed the question to the edge networking team and will let you know what I find out.

Thanks for spending more time on this. As I have to look at alternatives, I deployed the site to Heroku as well. Without any CDN, the download speeds are much better. While trying to understand why, I noticed that Netlify serves the files using Brotli, while our Heroku setup uses gzip.

That led me to consider that perhaps the performance loss actually comes from Brotli encoding on your end. Brotli at level 11 is much slower to compress, and if you only cache the uncompressed files, that might explain it all. However, 5 seconds to compress a 1 MB file is still very slow. But I guess you are perhaps limiting CPU time for those processes on your edge nodes.
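The level-versus-speed tradeoff speculated about here can be sketched with stdlib `gzip` as a stand-in, since Python's standard library has no Brotli codec. The payload is invented, and gzip's `compresslevel` 1 vs 9 only illustrates the same shape of tradeoff as Brotli's levels 1 vs 11, not the same magnitude:

```python
import gzip
import time

# Roughly 1.2 MB of highly compressible JSON-like data.
payload = b'{"k": "value", "n": 123},' * 50000

for level in (1, 9):
    start = time.perf_counter()
    out = gzip.compress(payload, compresslevel=level)
    elapsed = time.perf_counter() - start
    print(f"level {level}: {len(out):>8} bytes in {elapsed * 1000:.1f} ms")
# Higher levels squeeze out more bytes but cost more CPU per request,
# which matters if compression runs on every uncached edge response.
```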

As an update, we are off Netlify and on to Heroku. Serving the same files directly from Heroku dynos without any CDN is much, much faster and much more consistent. With CloudFlare in front we also get better coverage.

While I am sure a single customer like us is of little importance, I would really encourage Netlify to look deeply into these issues. Searching the forums it’s very clear that many users have been noticing these issues and have provided clear evidence of the problem, but they are generally dismissed as edge-cases by support personnel.

The issue is real and it matters. It's hard to know any details, but given the pattern in the support issues, I suspect it is related to encoding of highly compressible files. I have tried disabling Brotli in my browser using an extension, and that made things markedly better, albeit still not as fast as I would expect.

Thanks for that thorough diagnosis, @holm! Sorry we didn't get any resolution here. The team asserts that no type-specific handling of files is happening, but as you say, some things are easier to compress than others.

Most of the problem is in internal transfer time, though I realize that knowledge doesn't help you much.