Keep Old Deploy Files?

Every time I build my Nuxt site, it seems like all of my existing files get removed from the CDN right away. If someone is on my site when I deploy, the site will blow up when they click a link, because the files for the deployment they loaded appear to be gone.

It would be great if there was a way for Netlify to keep files around for X days so deploys don’t break things for users who are mid-session.

Yes, I could do something different with hashes, but even that wouldn’t help. If someone went to the blog list page and I updated one blog post while they were on it, the site shouldn’t blow up when they click through to THAT blog post (since its hashed filename would be different).

In AWS we just use versioning (e.g. /public/1.2/blah.js) to ensure the files that were deployed are always there, no matter what. That way, when people are on the site, they can continue to browse the ‘old’ version of the site without it exploding.

I don’t think Netlify has that, do they?

Hey @hecktarzuli,

This sounds most peculiar. If an asset isn’t available at the CDN, we’ll return to origin and fetch the file, caching it in the process (so long as it’s under 4MB).

Netlify makes use of atomic deploys which are covered under our Netlify Edge product page. There’s no reason why, systematically, your live site should fail. We don’t delete or remove old deploys. There’s more at play here, that’s for sure!

Can you provide a deploy log and/or x-nf-request-id from the request headers when you encounter this? Are you proxying your requests? Are there service workers at play? Do you have another tab with the site open when you refresh? For this, perhaps a recording would be super useful.

Great info! I’ll dig more tonight.

I’m on the free plan, does that still use atomic deploys?

I just re-created the issue. My blog pages change every deploy (something I can fix), but you are definitely removing files from old deploys from the CDN: I watched a file that existed go to a 404, which blew up my site.

Again, I’m on the free plan.

Aha! Congratulations, you’ve found the anomaly. :partying_face:

Deploy hashes in the file name, when used in conjunction with a custom domain, will cause the page to 404. You should really avoid the use of hashes.

To answer your question – yes, atomic deploys are fundamental to what we do :smile: and they’re most certainly part of the Starter tier. In short, each time you deploy, the cache is invalidated and the latest content will be served. So this makes using hashes redundant!

This flies in the face of most conventional wisdom about building websites. Hashing is important because it lets the browser know immediately that different JS is required (JS it has not cached before) and fetch it.

It also means that if a file doesn’t change between builds, the browser does not need to fetch it again.

This is the Jamstack! Forget convention :wink: