We’ve added a robots.txt file to our repo, and deployed it to the Netlify server.
We did this because we noticed that Netlify was grabbing the robots.txt file from our CMS (which is hosted on a subdomain) whenever it couldn't find the file on Netlify.
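For reference, the fallback seems to come from a catch-all proxy rule along these lines in our `_redirects` file (this is a sketch; `cms.example.com` is a placeholder for our actual CMS subdomain):

```
# Proxy any path that doesn't exist in the deploy to the CMS subdomain.
# Without a trailing "!" this rule only fires when Netlify can't find the
# requested file in the published site, which is why robots.txt fell
# through to the CMS before we added our own copy.
/*  https://cms.example.com/:splat  200
```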
But since adding this file, the served robots.txt doesn't update; it still shows the contents of the one from our CMS subdomain.
How long will this take to expire/clear?
Weirdly, I updated other files on the server and they updated instantly, but this robots.txt file didn't. Why would that be?
I can also see that it's correct on one of the deploy previews, so I'm a bit confused.
As far as delays go, there isn't really a way to pinpoint exactly what happened here without something like an x-nf-request-id (more below), because we serve so much traffic that even knowing an approximate time window when the issue occurred barely narrows it down.
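For example, here's a minimal way to capture that header with curl (the domain below is a placeholder; swap in the URL that's serving the stale file):

```bash
# Fetch robots.txt, discard the body, and print the response headers;
# x-nf-request-id uniquely identifies this request in Netlify's logs.
curl -sv -o /dev/null https://www.example.com/robots.txt 2>&1 | grep -i 'x-nf-request-id'
```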
It also really depends on exactly how your site is set up. You can learn a little more about how best to take advantage of caching here:
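As one sketch of what that setup can look like, custom Cache-Control headers can be set per-path in `netlify.toml` (the values below are illustrative, not a recommendation for your specific site):

```toml
# Illustrative only: ask browsers and intermediary caches to revalidate
# robots.txt on every request instead of serving a stored copy.
[[headers]]
  for = "/robots.txt"
  [headers.values]
    Cache-Control = "public, max-age=0, must-revalidate"
```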