Thanks for that; I can see the same via curl on the command line. It may be that the smallness of the index.html (711 bytes) means we don’t compress it. I’ll ask the team in charge of that component of our CDN when I meet with them later this week, and will follow up here afterwards!
fool was bang on the money: we won’t compress assets smaller than 1 KB. I hope this helps!
Hah, perfect! Thank you for the investigation. I’ve just increased the page size and, as if by magic, the site is now compressed.
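For anyone who wants to make the same check from the command line, here is a small sketch. With a live site you would inspect real headers with curl; the URL in the comment and the canned header blob below are placeholders so the sketch runs anywhere:

```shell
# Live check (URL is a placeholder, substitute your own site):
#   curl -sI -H 'Accept-Encoding: br, gzip' https://your-site.netlify.app/
# Below we parse a canned header blob standing in for curl's output.

headers='HTTP/2 200
content-type: text/html; charset=utf-8
content-encoding: gzip
content-length: 2048'

# Pull out the Content-Encoding value (case-insensitively), if present.
encoding=$(printf '%s\n' "$headers" | tr -d '\r' \
  | awk -F': ' 'tolower($1) == "content-encoding" { print $2 }')

echo "content-encoding: ${encoding:-identity (not compressed)}"
```

If no `content-encoding` header appears at all, the response was served uncompressed.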
It would be awesome to serve pre-compressed files, especially if that includes Brotli.
Please add me to that request list.
Are you not seeing our automatic Brotli encoding working well, @jsphpndr? We don’t have any plans to add that feature in the foreseeable future, so I’d rather get an understanding of where the system falls short today than only add your voice to a feature request that is likely to end up with WONTFIX status.
I’m looking for less “I want this solution” and more “I am trying to solve this problem and don’t seem to be able to with Netlify.” I’d assert that “serving pre-built Brotli files” is a solution to some problem, so that problem is what I’m seeking clarity on.
Problem: to deploy PWA apps based on Wasm and three.js, I have to maintain Nginx on a VPS, since standard webhosting options don’t allow compression of the less commonly used file types (wasm, dll, glb, gltf). Compression matters here: it takes page sizes from, e.g., 10 MB down to 3 MB.
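For context, the kind of Nginx configuration being maintained here looks roughly like the following sketch. Directive values are illustrative, and the brotli directives require the third-party ngx_brotli module:

```nginx
# Compress less common binary types on the fly (values illustrative).
gzip on;
gzip_min_length 1024;
gzip_types application/wasm model/gltf-binary model/gltf+json application/octet-stream;

# Or serve files pre-compressed at build time instead:
gzip_static on;      # looks for a .gz sibling of the requested file
brotli_static on;    # same for .br; needs the ngx_brotli module
```

The `gzip_static` / `brotli_static` route is what “serving pre-compressed files” means in practice: the compressed variants are produced once at build time rather than on every request.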
Netlify doesn’t seem to have the option to serve those file types compressed, even if I handle the compression myself, if I understand this thread correctly.
Netlify can certainly serve any file type, but indeed, our compression is applied only to things we are pretty sure will benefit, and we don’t have that certainty for files like the ones you mention. What DLLs does your SPA need, out of curiosity?
I guess that might be behind people’s requests: it’s easy for us to test for our own use case, but not something Netlify can know at a population level. I know glb and gltf files in particular can hold a wide range of content, so they vary widely in whether it’s worth compressing them or not.
For example DLLs, see https://github.com/Thunderducky/inkwasm/tree/master/dlls (not my repo); a few people in the ink community have been looking at using wasm to put one of the widely used tools online.
Ah, so am I reading this right: these aren’t files your SPA uses, but content you want to share? I ask because we don’t use DLL files during build or at runtime, as far as I am aware.
I’ll get confirmation from our traffic engineering team that my assertion about “what file types we compress” is accurate, and then we can get the use case ironed into a feature request if things aren’t working in the best way for your site. Not sure how soon I’ll get to talk with them (could be a week), but I will follow up here once I have!
Yes, if you are referring to a build process on deploy, then they aren’t run; they were generated once and are just part of the static content included with each build of the app.
No, if you are talking about running in the browser: they are used by the app in a worker. In this case it’s a processor written in C#, packaged to run in the browser as wasm/DLLs, that takes a particular form of markdown and returns JSON.
For scale, in this case the DLLs compress from ~10 MB to ~3 MB.
For background on glb / gltf files, see a similar request on GitHub Pages (https://github.community/t/support-for-gzip-on-glb-3d-model-files/11004).
Thanks for taking an interest in this; I look forward to hearing back in due course.
I would expect our CDN nodes to serve those files compressed. If that isn’t happening, would you please send us a URL for a file at Netlify which should be compressed but isn’t?
Dear reader: turns out that “if” was doing a lot of work.
Works for me for glb, wasm and DLLs, so I’m happy!
As an aside, yours is the fifth option I’ve tried across webhosting, managed VPSes and other Netlify-style providers, and the only one that actually works. I’d definitely add that to your ads / support info.
Hi, @deepbluezen, we are always happy to research unusual behavior and thank you for letting us know those files are served gzipped for you.
There are edge cases to be aware of for compressed file serving (which is otherwise automatic: you don’t need to manually gzip files before deploying).
For example, we don’t gzip things that are bigger when gzipped. It might sound odd for gzipping to make a file larger, but this often happens with PNGs: they are already highly compressed, so the gzip headers only make them bigger.
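You can see this effect locally with a small sketch. Random bytes stand in for already-compressed data (a PNG’s pixel data is DEFLATE-compressed, much like gzip output), and gzipping such data again only adds overhead:

```shell
# Random data is effectively incompressible, like the inside of a PNG.
head -c 100000 /dev/urandom > sample.bin

# First gzip pass: stands in for the compression the PNG already has.
gzip -9 -c sample.bin > sample.gz
once=$(wc -c < sample.gz)

# Second pass: "gzipping a PNG". The output is larger, not smaller.
gzip -9 -c sample.gz > sample.gz.gz
twice=$(wc -c < sample.gz.gz)

echo "compressed once: $once bytes; compressed again: $twice bytes"
rm -f sample.bin sample.gz sample.gz.gz
```

The second file is always a little larger than the first, which is exactly why re-compressing already-compressed formats is skipped.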
If you do see anything you think is unusual, please always feel free to make new posts and/or topics here. We will be happy to take a look. The same goes for questions too; it doesn’t have to be only for bugs. We love answering any questions about our service.
If you still think the responses are uncompressed, we need to be able to track the HTTP responses for this issue. The simplest way to do this is to send us the x-nf-request-id header, which we send with every HTTP response. There is more information about this header here:
If that header isn’t available for any reason, please send the information it replaces (or as many of these details as possible):
- the complete URL requested
- the IP address for the system making the request
- the IP address for the CDN node that responded
- the day of the request
- the time of the request
- the timezone the time is in
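As a convenience, the header can also be grabbed from the command line. The curl URL in the comment is a placeholder, and a canned response stands in for live output below so the sketch runs anywhere (the request-id value is made up):

```shell
# Live: curl -sI https://your-site.netlify.app/ would produce headers
# like the canned sample below (x-nf-request-id value is made up).
response='HTTP/2 200
content-type: text/html
x-nf-request-id: 01ARZ3-example-0000'

request_id=$(printf '%s\n' "$response" | tr -d '\r' \
  | awk -F': ' 'tolower($1) == "x-nf-request-id" { print $2 }')

echo "x-nf-request-id: $request_id"
```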
@luke thanks for looking into this. It must have been a temporary issue. We ran a test yesterday on webpagetest.org and our js files were all coming up as uncompressed. Just ran it again and everything looks good now. Thanks again for the quick response.
@luke, @fool I also have a problem with gzipping: I can’t find any file (bigger than 1 KB) that is being gzipped. Please let me know if I need to add something to my _headers file; maybe I have somehow turned it off? Or maybe it’s because I use CloudFlare as my DNS provider? It happens both with CloudFlare’s URL and with Netlify’s direct link.
No need to set anything custom for this feature to work, and your DNS provider is irrelevant.
Your browser announces support for Brotli compression (as well as gzip/deflate) in this request header from your screenshot:

Our service responds with a Brotli-compressed asset:
So I think things are working well. Let me know if you still disagree!
Sorry to pile in on this thread, but I just want to sanity check my observations in terms of asset compression, as I feel like I’m going a bit mad.
I have a two site setup where content is built and available on site 1 (wizardly-payne-3612b7) and served to a frontend on site 2 (clever-murdock-f2e04c). Site 1 is served via a proxy route on site 2 to mitigate CORS issues (some of the assets are downloaded via JS).
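For reference, the proxy route between the two sites is the standard Netlify rewrite setup, roughly like this. The `/content/*` path is illustrative; the 200 status is what makes it a same-origin proxy rather than a redirect, which is what sidesteps CORS:

```
# _redirects on site 2 (the frontend), proxying asset requests to site 1.
/content/*  https://wizardly-payne-3612b7.netlify.app/:splat  200
```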
Here’s a request for a JPEG image, which is considerably over the 1 KB size mentioned above, and can be compressed to a considerably smaller size with gzip. I don’t believe this file is being compressed for transfer:
I tested this in Chrome, Safari and Firefox and saw the same behaviour, but I considered that there might be something weird in my local client setup causing Netlify to avoid compressing assets. However, I observed the same results for the same asset request using a remote machine on browserling.com:
I finally considered that the proxying, custom headers or some other technical implementation detail in my setup could be causing the issue, so I checked the site @teamgi had posted in August. I also found no compression was applied to this png, despite it being over the 1 KB minimum and despite gzipping seemingly producing a smaller file in my tests.
Would you be able to help me understand if I’m doing anything wrong that’s causing Netlify to avoid compression of these assets please? I know that I can probably further optimise the size of the original assets in many cases (as well as using some of the modern image formats like webp and avif), but gzip or brotli encoding feels like a baseline performance gain that could improve the speed of download for these assets.
Hi, @andrewbridge, the JPEG itself is already an incredibly highly compressed file format.
There is a thought experiment in data compression theory which asks, “Can all files be compressed?”
The only logical answer is “no.” Taking the thought experiment to the extreme: if the answer were “yes,” then every result of a file compression could itself be compressed again. If that were true (and it is not), we could keep compressing any file over and over until it was only a single bit in size. That is obviously impossible.
Clearly a single 0 or 1 in binary cannot represent all possible files, so there is a limit to how much any one file can be compressed.
Why do I ramble about data compression theory?
Because it applies directly here: if you gzip this image with the best gzip compression, it gets bigger, not smaller:
-rw-r--r-- 1 me people 28806 Nov 18 18:32 kings-disease-nas-xsmall-square.jpg
-rw-r--r-- 1 me people 28838 Nov 18 18:32 kings-disease-nas-xsmall-square.jpg.gz
Gzipping adds 32 bytes to the file size in my tests (which were done using Apple gzip 287.100.2 and the --best option). We won’t compress a file if there is no gain. I’m not sure how you are gzipping if you are getting a different result. Again, JPEGs are very hard to compress precisely because that is what a JPEG is: an already highly compressed file.
It will be very common for already compressed files not to be further compressed by brotli or gzip for this reason.
If there are other questions about this, please let us know.