Just tried disabling asset optimization entirely in the site’s settings and deployed a new version. It did not fix the problem; I’m still seeing the same issue.
I don’t think file compression is the issue here. Even if we compress it further with Brotli when serving, it should be decoded back to the original by your browser in the end.
Also, there’s a huge difference between the two files you’ve shared. The one uploaded to Netlify seems to be 12 MB, while the one on AWS seems to be 50+ MB. That is not some compression applied by Netlify; those are two entirely different files.
I’d advise you to check the file that’s being uploaded and compare those two.
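For example (paths and URLs here are placeholders), a quick checksum comparison will tell you whether the two origins really hold different files. Plain curl doesn’t decompress anything, so you see the bytes as stored:

```bash
# The file in your local deploy folder:
sha256sum dist/data.bin.br

# What S3 actually stores:
curl -s https://your-bucket.s3.amazonaws.com/data.bin.br | sha256sum

# What Netlify stores (curl sends no accept-encoding header, so no on-the-fly compression):
curl -s https://your-site.netlify.app/data.bin.br | sha256sum
```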
The details you’ve described are the exact problem. Netlify is Brotli-compressing an already compressed file, which has a very small effect on the size. When you click the link to download the file, your browser automatically decompresses it as part of the download process. So when downloading the version served from Netlify, the browser decompresses it once, undoing only the little bit that the second compression pass accomplished. The 12.6 MB file you end up with is still compressed, which is why it’s so much smaller.

When downloading the version from S3, the browser decompresses the original, highly compressed file that was uploaded, which is why the downloaded file is over 4x larger.
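You can reproduce that size arithmetic locally with the brotli CLI. A minimal sketch, with placeholder filenames and an assumed (purely illustrative) on-the-fly quality level for the second pass:

```bash
# First pass: what you do before uploading (slow, maximum quality)
brotli -q 11 -k data.bin                    # -> data.bin.br (e.g. ~50 MB down to ~12 MB)

# Second pass: roughly what a CDN adds on the wire to the already compressed file
brotli -q 5 -o data.bin.br.br data.bin.br   # shrinks barely at all; the input is already dense

# The browser decompresses exactly once, which only removes the outer layer:
brotli -d -o downloaded.bin data.bin.br.br
cmp downloaded.bin data.bin.br && echo "the download is still your pre-compressed file"
```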
This is a little easier to see when analyzing text files instead of binary files. Here’s a JS file that we’ve been pre-compressing and uploading to S3, and that we’re trying to move to same-origin hosting through Netlify.
Note that when viewing the S3 version, it loads as expected and you can read the text, while the Netlify one has strange characters throughout. This is because it’s still compressed — the browser’s converting what is still binary data to text and ending up with that garbage.
You can further verify this by manually decompressing the JS file hosted on Netlify. Using the official brotli utility, run `brotli -d WebGL.framework.js.br` and then inspect the output file: it is the expected, fully decompressed JS. And if you compare it against the file hosted on S3, you’ll see that it is the exact same file, byte for byte.
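For anyone who wants to run the same check, it looks something like this (URLs are placeholders; plain curl sends no accept-encoding header and never decompresses, so you get the raw stored bytes from both origins):

```bash
curl -so netlify.js.br https://your-site.netlify.app/WebGL.framework.js.br
brotli -d -o netlify.decoded.js netlify.js.br   # the one manual decompression pass

curl -so s3.js.br https://your-bucket.s3.amazonaws.com/WebGL.framework.js.br
brotli -d -o s3.decoded.js s3.js.br

cmp netlify.decoded.js s3.decoded.js && echo "identical, byte for byte"
```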
I get a correct file served (at least it doesn’t show the mangled characters you showed)
(I had to set the content-type to `text/javascript`, or else it was being served as `application/octet-stream`, thus downloading the file instead of displaying it in the browser.)
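For reference, on Netlify that can be done with a `_headers` file in the publish directory (the path below is just an example):

```
/WebGL.framework.js.br
  Content-Type: text/javascript
```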
Note: to verify, I disabled Brotli in my browser (and even gzip, to be sure) and tried to load the file from your Netlify link. Since the browser was no longer sending the `accept-encoding` header, Netlify would not serve Brotli, and I was still getting the weird file, which makes me think something else happened during the file upload.
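The same check is quicker with curl, which sends no accept-encoding header unless you add one (URL is a placeholder):

```bash
# No accept-encoding sent: expect no content-encoding header in the response
curl -sI https://your-site.netlify.app/WebGL.framework.js.br | grep -i content-encoding

# Explicitly advertise Brotli: expect "content-encoding: br" on the compressed response
curl -sI -H 'Accept-Encoding: br' https://your-site.netlify.app/WebGL.framework.js.br | grep -i content-encoding
```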
> I get a correct file served (at least it doesn’t show the mangled characters you showed)
Did you upload the compressed version or the uncompressed version? When you download it from S3 through the browser, it’s automatically decompressed. Did you download it with curl?
> I disabled Brotli in my browser (and even gzip, to be sure) and tried to load the file from your Netlify link. Since the browser was no longer sending the `accept-encoding` header, Netlify would not serve Brotli, and I was still getting the weird file
Right, Netlify does not recompress the file in the absence of the accept-encoding header, but the browser also therefore does not decompress the response. And since the file was already compressed before uploading, it’s served as is: compressed, with no response header instructing the browser to decompress it, which is why it shows up as garbled content.
Hi, @Lou. Netlify isn’t corrupting anything. We send exactly the file you uploaded. As you uploaded it compressed, we send it in that format.
Netlify won’t auto-detect pre-compressed files, as that is not an available feature at this time. Our support team can enter a feature request for supporting pre-compressed files, but what you are doing is not supported currently.
All compression on the wire at Netlify happens automatically. In your case, you are uploading an already compressed file. Netlify sends that file as is and also Brotli-compresses it again on the wire. The browser doesn’t know there are two layers of compression, so it only undoes the outer one. This is why the file it hands you after decompression is still compressed (uploading pre-compressed files is not supported).
The solution here is to not pre-compress the files and to let our service do it automatically. If you want us to enter that feature request to support manually pre-compressing files, please let us know.
Thanks for your response and for understanding the situation. Yes, please enter the feature request to support manually pre-compressing files.
It’s probably not as simple as this, but a flag/setting/checkbox along the lines of "don’t compress files ending in `.br`, but still serve them as Brotli-compressed, with a `Content-Encoding: br` response header" would solve this problem for us.
The automatic compression that Netlify does isn’t at a high enough quality setting for this use case. We’re compressing these at quality level 11, and it takes quite a bit of time; it’s not practical to do at serve time.
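To illustrate the tradeoff (timings and sizes will vary by machine and file; the filenames follow the example above, and the on-the-fly quality level is an assumption):

```bash
# Offline, maximum-ratio pass: slow, but it only runs once at build time
time brotli -q 11 -k -f WebGL.framework.js

# A typical on-the-fly quality level: much faster, but a noticeably larger output
time brotli -q 5 -k -f -o WebGL.framework.js.q5.br WebGL.framework.js

ls -l WebGL.framework.js*.br
```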
This is an existing feature request, and I have passed your suggestion on to our Product team, who plan our roadmap and choose what we’ll build in the future. In the end, it’s their call, but I would not expect them to build this. Why’s that? Read this thread to find out!
Thanks for your reply, and for passing the request along.
I read the thread. It feels more condescending than clarifying — most of the people here are building products themselves and understand the difficulties around prioritizing which customers to focus on and what to build — but I understand the intent of the explanation.
I hope the Netlify product team will see the pattern of the requests for this feature — applications using WebAssembly that depend on large data files — and consider the strategic benefit of supporting this rapidly growing segment of web development by building this feature.
For now, my team will stick to serving these files through CloudFront.
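For anyone landing here with the same need: on S3/CloudFront, the pattern is to upload the pre-compressed file with the encoding metadata set explicitly. A sketch with placeholder bucket and paths:

```bash
aws s3 cp Build/WebGL.framework.js.br s3://your-bucket/Build/WebGL.framework.js \
  --content-encoding br \
  --content-type text/javascript
```

The object is stored compressed, but because of the `--content-encoding br` metadata it is served with a `Content-Encoding: br` header, so browsers decompress it transparently.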