I’m using the Netlify Prerender Extension on my site and prerendering is working perfectly for Googlebot and other recognized crawlers. However, I’ve discovered that Anthropic’s Claude crawler (ClaudeBot/Claude-Web) is not triggering prerendering.
Which site is this about?
thanks
Thank you for sharing. You're right: prerendering is not working for that user agent. I looked into it further, and it appears we don't recognize that user agent as an AI agent, mainly because it isn't an officially documented one.
Could you find an official source that confirms exactly what this user agent is supposed to do, so we can classify it correctly?
Hi! First, I want to clarify that I am not an expert in SEO or web development. I built a website, deployed it on Netlify, and have been reading articles and running tests to learn more about robots and crawlers.
To check whether my Prerender extension was working with AI crawlers, I ran this command (note that there is no x-prerendered header in the response):
$ curl -A "ClaudeBot" -I https://alvaroleonnutricion.com
HTTP/2 200
accept-ranges: bytes
age: 8126
cache-control: public,max-age=0,must-revalidate
cache-status: "Netlify Edge"; hit
content-type: text/html; charset=UTF-8
date: Sun, 12 Apr 2026 14:33:33 GMT
etag: "d4826afda2ce822c42c9bc3648a10425-ssl"
server: Netlify
strict-transport-security: max-age=31536000
x-nf-request-id: 01KP11SE57NM64J0J6JBZZVK0T
content-length: 1194
However, if I run the following instead:
$ curl -A "ClaudeBot/1.0" -I https://alvaroleonnutricion.com
HTTP/2 200
age: 1666
cache-control: public,max-age=0,must-revalidate
cache-status: "Netlify Edge"; hit
cache-tag: nf-prerender
content-type: text/html; charset=utf-8
date: Sun, 12 Apr 2026 14:34:45 GMT
netlify-vary: query
server: Netlify
strict-transport-security: max-age=31536000
vary: Accept-Encoding
x-content-type-options: nosniff
x-nf-request-id: 01KP11VJ6PYVQHCSC4FDB7HT57
x-prerender-timestamp: 2026-04-12T14:07:09.404Z
x-prerendered: true
prerendering is triggered successfully. Investigating further, I found that Claude uses this full user-agent string: Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; ClaudeBot/1.0; +claudebot@anthropic.com). This is why using just ClaudeBot does not produce a match, while ClaudeBot/1.0 does. In its crawler documentation, Anthropic explains which robots it uses to index, search the web, or retrieve web content.
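For anyone who wants to reproduce the comparison, here is roughly what my test looked like as a script. The check() helper and the grep on the x-prerendered header are my own shorthand, not part of the extension; the per-call result comments reflect what I observed on my site, not guaranteed behavior:

```shell
#!/bin/sh
# Compare the bare token against Anthropic's full documented user-agent string.
site="https://alvaroleonnutricion.com/"
bare='ClaudeBot'
full='Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko; compatible; ClaudeBot/1.0; +claudebot@anthropic.com)'

check() {
  # Fetch headers only (-I), quietly (-s); report whether Netlify's
  # x-prerendered marker is present in the response headers.
  if curl -sI -A "$1" "$site" | grep -qi '^x-prerendered:'; then
    echo "prerendered"
  else
    echo "origin"
  fi
}

check "$bare"   # "origin" in my tests: bare token does not match
check "$full"   # "prerendered" in my tests: full string matches
```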
I went deeper with my research and found this page in the OpenAI documentation: Overview of OpenAI Crawlers. It specifies the user agent used by each tool:
- OAI-SearchBot: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/131.0.0.0 Safari/537.36; compatible; OAI-SearchBot/1.3; +https://openai.com/searchbot
- GPTBot: Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko); compatible; GPTBot/1.3; +https://openai.com/gptbot
- ChatGPT-User: Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko); compatible; ChatGPT-User/1.0; +https://openai.com/bot
I tried each one with curl -A "<full_user-agent_string>" -I "https://alvaroleonnutricion.com/", and in all cases, the prerendering was triggered. I also asked ChatGPT to access my website and checked the logs, confirming that it effectively uses the third user-agent (I did the same check with Claude and also saw that the user-agent was the one specified above).
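The loop I ran was roughly the following. The grep check on the x-prerendered header is my own shorthand for "prerendering was triggered"; the URL is my site:

```shell
#!/bin/bash
# The three user-agent strings from OpenAI's crawler documentation.
uas=(
  'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/131.0.0.0 Safari/537.36; compatible; OAI-SearchBot/1.3; +https://openai.com/searchbot'
  'Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko); compatible; GPTBot/1.3; +https://openai.com/gptbot'
  'Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko); compatible; ChatGPT-User/1.0; +https://openai.com/bot'
)

for ua in "${uas[@]}"; do
  # -s: quiet, -I: headers only; look for Netlify's x-prerendered marker.
  if curl -sI -A "$ua" "https://alvaroleonnutricion.com/" | grep -qi '^x-prerendered:'; then
    echo "prerendered for: $ua"
  else
    echo "not prerendered for: $ua"
  fi
done
```

In my tests, all three full strings printed the "prerendered" line.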
However, if you use only GPTBot or ClaudeBot as the user agent, prerendering is not triggered. I am not sure whether this counts as an issue, but in my opinion it shouldn't be a problem in practice: OpenAI, Anthropic, and presumably others such as Gemini or Perplexity send the full user-agent string, which does trigger the prerender.