Essentially, Google Search Console can’t access my site or my sitemap. I have a robots.txt that should allow indexing, and DuckDuckGo and Bing can both index my site without issue. My DNS is set up through Netlify. I’m really just at a loss here - I didn’t expect a static site to have any trouble with Google’s crawler.
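For reference, here’s roughly how I sanity-checked that my robots.txt isn’t blocking Googlebot. This is just a sketch - `example.bond` stands in for my actual domain:

```python
import urllib.robotparser

# Placeholder domain - substitute your actual site.
SITE = "https://example.bond"

# Fetch and parse the live robots.txt.
rp = urllib.robotparser.RobotFileParser()
rp.set_url(f"{SITE}/robots.txt")
rp.read()

# Check whether Googlebot (and crawlers in general) may fetch the homepage.
for agent in ("Googlebot", "*"):
    print(agent, "allowed:", rp.can_fetch(agent, f"{SITE}/"))
```

Both checks come back allowed for me, so I don’t think robots.txt is the problem.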
How long have you been seeing this? I’m asking because this happened recently:
I’m guessing it’s because of that? It’s only a guess, since I don’t have any info to troubleshoot - Google just said it failed without specifying a reason. Or maybe it’s using outdated tech by some chance?
My site has been up for a while, but I never checked it on Google, so I didn’t notice - I don’t think the problem is recent, though. Google Search Console has no information at all for my site, which suggests it has never successfully indexed it.
When you say “it’s using outdated tech by some chance”, are you asking whether Google’s Googlebot is using outdated tech? If so, I doubt it, but I don’t think there’s much I could do about it if that were the case either…
Thanks for your response. I didn’t see anything I could change in Netlify to fix the issue. I’m using Netlify’s nameservers, if that helps at all. Every other search engine I’ve checked has had no trouble indexing my site. I keep worrying that it has something to do with the “.bond” TLD, as I don’t see many of those on Google.
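In case it helps, this is roughly the check I ran to confirm the domain resolves and the site responds to a Googlebot-style request. Again just a sketch with a placeholder domain - it only rules out obvious blocking, not whatever Google’s crawler is actually doing:

```python
import socket
import urllib.request

# Placeholder domain - substitute your actual site.
HOST = "example.bond"
URL = f"https://{HOST}/"

# Confirm the domain resolves through the configured (Netlify) nameservers.
print("Resolves to:", socket.gethostbyname(HOST))

# Fetch the homepage with a Googlebot-like User-Agent to see whether the
# response differs from a normal browser request.
req = urllib.request.Request(
    URL,
    headers={
        "User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; "
                      "+http://www.google.com/bot.html)"
    },
)
with urllib.request.urlopen(req, timeout=10) as resp:
    print("Status:", resp.status)
    print("Content-Type:", resp.headers.get("Content-Type"))
```

The domain resolves and I get a 200 with the expected content type, so nothing on my end appears to be treating Googlebot differently.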
EDIT:
I have submitted a report to the Indexing & Crawling Team at Google - if I hear anything back I’ll respond in this thread.
Do let us know what the team says, and we’d be happy to investigate. In any case, without any actual logs or error messages, we’re as stumped as you are.
Considering your website is accessible to other users and even other search engines, it looks like it’s correctly configured on Netlify’s end. However, any additional insights from their end would help us investigate this further.