This is a simple deploy from GitHub: just an HTML file, CSS, and a few JS files.
When I check the site in the developer tools under the Network tab, I am seeing:
cache-control: public, max-age=0, must-revalidate
This would be fine for most of my files, but there is one that is 3 MB (15 MB on disk) and it never changes. Do I need to set up a _headers file if I want any of my files to have a longer caching policy? And do I need to disable Asset Optimization on the whole site to do this? It's important that at least this one large file stays gzipped.
I think I am getting confused by the different types of caching. I’m quite new at this. Any help you could give to point me in the right direction would be appreciated.
You can change the cache headers if you're sure that the file is never going to change, or at least not for long periods of time. Basically, if you're fine with that file not updating even when you push a new deploy, you can use cache headers.
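For example, you can add a _headers file to your publish directory with a longer policy for just that one file (the path below is a placeholder; use your file's actual URL path):

```
# _headers file at the root of the publish directory.
# Cache this one large, rarely-changing file for 30 days;
# everything else keeps the site's default policy.
/assets/big-file.js
  Cache-Control: public, max-age=2592000
```

Anything not matched by a rule keeps the default cache-control: public, max-age=0, must-revalidate.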
Asset Optimization has nothing to do with caching policies. If that file can be compressed with Brotli or gzip, it will be, regardless of whether you're using Asset Optimization.
Thanks! I think I don’t deeply understand caching and that is what is tripping me up.
So my choice is either:

- the file never caches in the browser (as it is now), or
- if I set a policy, say to keep it for a month, then even if I change the file and re-deploy, it won't update for users who have already cached it.
Correct? I guess it would take some kind of magic otherwise. Is there no automatic system to rename the file when its content changes, like with a hash or something? I thought I had read about that somewhere.
So what it was talking about in the Instant Cache Invalidation article (which I clearly didn’t comprehend!) was just caching on the CDN?
Yes, your understanding is correct. However, the CDN tells browsers to load the file from the browser cache (by sending a 304 status), so you should be able to see the cache being used. There's likely an ongoing bug where that's not always working; you can read more about it here:
About the latter: you can indeed use cache-busting techniques like changing the file name or adding a query parameter, but that would just ask browsers to look for the file again. On our CDN, the same file would still be cached and served.
About instant cache invalidation: yes, we're mainly talking about removing the cache at our CDN (as that's the one we can control), and since we ask browsers to check for the file every time, we can always serve the latest content as long as our CDN has it.
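The hash-based renaming asked about earlier is what build tools like webpack and Vite do automatically; it's usually called content hashing or fingerprinting. A minimal sketch of the idea in Python (the function name and the 8-character digest length are my own choices, not any tool's convention):

```python
import hashlib
from pathlib import Path

def fingerprinted_name(path: Path) -> str:
    """Derive a cache-busting filename such as 'data.3f2a91bc.bin'
    from the file's contents. Because the name changes whenever the
    bytes change, the file can be cached indefinitely; a new deploy
    simply references a new name."""
    digest = hashlib.sha256(path.read_bytes()).hexdigest()[:8]
    return f"{path.stem}.{digest}{path.suffix}"
```

Build tools also rewrite the references in your HTML to point at the new name; doing it by hand would mean updating every link on each change.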
It shows that the file is being loaded from the local disk cache (with a 200) and is not being sent from Netlify.
There is also a second way to see if it is a cached response or a response coming from Netlify:
Does the x-nf-request-id header change?
If the header is staying the same, that is a local disk cache serving the file.
Netlify always changes the x-nf-request-id header; we never send the same one twice. If you see the same header repeated (which I do for this URL), then the responses must be coming from the local disk cache.
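That check boils down to comparing the header across two requests; a small sketch (the header values below are made up for illustration):

```python
def is_browser_cached(first_request_id: str, second_request_id: str) -> bool:
    """Netlify issues a unique x-nf-request-id per response, so seeing
    the same value twice means the second response never reached
    Netlify and was served from the local cache instead."""
    return first_request_id == second_request_id

# Made-up header values for illustration:
print(is_browser_cached("abc-123", "abc-123"))  # same id: served from local cache
print(is_browser_cached("abc-123", "def-456"))  # new id: served by Netlify
```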
To summarize, this file is using the new cache-control header and it is working when I test.
If there are other questions or concerns, please let us know.
Thank you so much! I see now that Firefox does not do such a good job of showing what is coming from cache or not; that's what I was using. Chrome seems to do a better job.
Thank you both to @hrishikesh and @luke. I wish I could mark you both as the solution, but unfortunately Discourse only allows one. You both gave excellent information and I’m really impressed with Netlify.