Google keeps saying the site can't be crawled

We just deployed a new site on Netlify, but Google keeps saying that the site is “blocked by robots.txt”. We don’t even have a robots.txt file on the site. We have cleared the cache and rebuilt several times, but Google keeps saying it’s blocked. How do we fix this? Why is Netlify blocking the site from Google? We actually downloaded our deployed files and there is no robots.txt in there at all. Is Netlify adding this somewhere?

Hey @osseonews

No, my understanding is that Netlify adds no such file.

I don’t believe Netlify intentionally (or unintentionally) blocks search engines from crawling sites.

What is the name of the site? Are you using a custom domain?
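In the meantime, if a robots.txt does turn up somewhere (e.g. served by a proxy or an old cached copy), you can check whether its rules actually block Googlebot with Python's standard-library parser. This is just a local sketch; the disallow-all rule below is illustrative, not something Netlify is known to serve:

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt content: a wildcard disallow-all rule,
# the kind of rule that would produce a "blocked by robots.txt" report.
robots_txt = """User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot falls under the wildcard rule, so the root URL is blocked.
print(parser.can_fetch("Googlebot", "https://example.netlify.app/"))
```

Running this prints `False`, confirming the rule blocks Googlebot. You can paste in whatever your live site actually returns for `/robots.txt` to see how Google would interpret it.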

Thanks. I cleared the cache on Netlify, did a rebuild, and it seems to work now, so I’ll assume this is fixed. It was very strange, as I’ve never had this happen with Google before. Maybe it was caused by the password protection we had on the site prior to going live, so maybe Google had that cached? Google is a mystery.

This is possible.

> Google is a mystery.

Certainly, and one I generally prefer not to try to unravel :slight_smile: