Sitemap.xml Couldn't Fetch

For the past 3 days, sitemap.xml couldn’t be fetched by Google Search Console. There is no error in the sitemap file itself, because when I submit it on Bing it succeeds.
If the problem is because I’m on the Starter plan, I can upgrade to Pro.
Please help.

Hi, @web44dmin, we are not able to troubleshoot Google Search Console but we can troubleshoot the site itself.

Normally, we would ask you for the x-nf-request-id for the bad response. There is more information about this header here:

Now, you probably cannot get that information from the Google Search Console. So, if that header isn’t available (and it probably isn’t), please send the information it replaces (or as many of these details as possible). Those details are:

  • the complete URL requested
  • the IP address for the system making the request
  • the IP address for the CDN node that responded
  • the day of the request
  • the time of the request
  • the timezone the time is in

I’m guessing only the following details will be available:

  • the complete URL requested
  • the day of the request
  • the time of the request
  • the timezone the time is in

Would you please send us those details?
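
If you’re able to reproduce a failed request from your own machine, something along these lines will print the x-nf-request-id for that response (just a sketch using curl; the URL below is a placeholder, so replace it with your actual sitemap URL):

curl -sI "https://your-site.netlify.app/sitemap.xml" | grep -i x-nf-request-id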

Hi, @web44dmin, I researched the request/response for this x-nf-request-id.

Note, this URL isn’t the site map URL so I’m not sure how researching this x-nf-request-id will resolve the site map issue.

The x-nf-request-id above is for a successful HTTP request to:

https://cookingclassy.netlify.app/

This was a successful response and the contents of index.html were sent. This x-nf-request-id doesn’t reference sitemap.xml in any way.

Did you have a question about this request? Do you have the x-nf-request-id for an HTTP response for sitemap.xml?

By the way, the support guide states that sending the x-nf-request-id as a screenshot is the wrong way to send these to us. Going forward please send us the x-nf-request-id as text, not as a screenshot.

I’m sorry, I sent you the wrong x-nf-request-id.
Here are the correct response headers:
accept-ranges: bytes
age: 0
cache-control: public, max-age=0, must-revalidate
content-encoding: br
content-type: application/xml
date: Wed, 15 Jul 2020 05:43:13 GMT
etag: "6ddfc45c95e7f9c9d338793c3d7e5d7d-ssl-df"
server: Netlify
strict-transport-security: max-age=31536000
vary: Accept-Encoding
x-firefox-spdy: h2
x-nf-request-id: 752bb670-35d9-4fc6-9c21-816520b13a49-2763156

200 OK

Hi, @web44dmin, I’m not seeing 752bb670-35d9-4fc6-9c21-816520b13a49-2763156 as a 200 response. This was a 304 response in our logging.

This indicates the etag above was sent because the browser had a locally cached copy of this file. Our CDN node confirmed the content was up to date (based on the etag) and therefore it didn’t resend the file, only the 304 status to confirm to the browser to use the cached copy.
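
Just to illustrate the mechanism: a conditional request that resends that etag should get the same 304 back. Assuming the headers above were for https://cookingclassy.netlify.app/sitemap.xml, a rough sketch with curl would be:

curl -sI "https://cookingclassy.netlify.app/sitemap.xml" -H 'If-None-Match: "6ddfc45c95e7f9c9d338793c3d7e5d7d-ssl-df"'

and the status line should report a 304 rather than a 200, with no body resent.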

So far, there are no indications of failures. Are you sending the x-nf-request-id for failing requests? The ID is specific to a request, not a URL.

I’m asking because all the checks so far are for successful responses and I thought we were trying to identify failing responses.

Hi, I have the same problem on my site, brain-beats.

These are the response headers from the Edge browser:

Request URL: https://brain-beats.netlify.app/sitemap.xml
Request Method: GET
Status Code: 200 (from service worker)
Referrer Policy: strict-origin-when-cross-origin

Accept-Ranges: bytes
Age: 1
Cache-Control: public, max-age=0, must-revalidate
Content-Encoding: br
Content-Length: 377
Content-Type: application/xml
Date: Mon, 24 Jul 2023 05:00:32 GMT
Etag: "e974be414cdb63ef293b25a566778c39-ssl-df"
Server: Netlify
Strict-Transport-Security: max-age=31536000; includeSubDomains; preload
Vary: Accept-Encoding
X-Nf-Request-Id: 01H634AFBBHYHZNV88BG66VSQ4

Is there any resolution to this issue? Google Search Console couldn’t fetch the sitemap for 5 years or so?!

This is the URL of the sitemap:

https://brain-beats.netlify.app/sitemap.xml

Hi, did you check your site’s “robots.txt” file to ensure that it allows search engine crawlers to access the sitemap? The robots.txt file should not block the sitemap’s path.

Also, I’d recommend using the “Fetch as Google” tool in Google Search Console to check whether Googlebot can access your sitemap without any issues. This will also help you identify any potential crawl errors.
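
For reference, a minimal robots.txt that allows all crawlers and also declares the sitemap location looks something like this (the Sitemap line is optional but helps crawlers discover the file; adjust the URL to your own sitemap):

User-agent: *
Allow: /

Sitemap: https://brain-beats.netlify.app/sitemap.xml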

There is no issue with the “robots.txt” file:

User-agent: *
Disallow:

I tried “Fetch as Google” but it couldn’t fetch it. There are no crawl errors.

Why is your service worker serving your sitemap? That looks like an invalid configuration to me. It’s not as if that’s causing the error with Search Console, but regardless, it’s not a good sign.

In any case, the sitemap is being served correctly from our end:

curl -I "https://brain-beats.netlify.app/sitemap.xml"
HTTP/2 200 
accept-ranges: bytes
age: 0
cache-control: public, max-age=0, must-revalidate
content-type: application/xml
date: Sat, 12 Aug 2023 13:22:10 GMT
etag: "c165af8680f24fbbf9bef16c2746e1c4-ssl"
server: Netlify
strict-transport-security: max-age=31536000; includeSubDomains; preload
x-nf-request-id: 01H7MYJMZB58HF8BHBR2MK6X40
content-length: 3395

We’re serving the file correctly. If Google is unable to read it, you might want to try contacting Google for help, or fetch us some logs from their end that indicate a failure on Netlify’s end.
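
If you also want to rule out a malformed sitemap body, you could validate the XML locally, for example (a sketch assuming xmllint from libxml2 is installed):

curl -s "https://brain-beats.netlify.app/sitemap.xml" | xmllint --noout -

A silent exit (status 0) means the file is well-formed XML; any parse errors would be printed instead.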

So the issue was fixed after I got a domain from Hostinger (without removing the sitemap from the service worker). So is the problem plausibly with the subdomain?

Again, without any error message or visibility into Google’s error logs, we cannot say anything. We’re serving the file without errors.

I didn’t have issues when my blogs were on subdomains, but could it possibly be from the hosting itself?