The User Agent Blocker is generating incorrect additions to robots.txt files based on the selections made under "Block user agents".
I did not enable the "Actively block requests using an edge function" option in my testing.
I have not tested every combination of options, but GoogleBot and BingBot always appear in the output, even when those options are not selected.
If you select all the options (except the edge function), the output generated is:
User-agent: GPTBot
Disallow: /
User-agent: GoogleOther
Disallow: /
User-agent: GPTBot
Disallow: /
User-agent: GoogleOther
Disallow: /
User-agent: PetalBot
Disallow: /
User-agent: Factset_spyderbot
Disallow: /
User-agent: LINER Bot
Disallow: /
User-agent: ClaudeBot
Disallow: /
User-agent: Timpibot
Disallow: /
User-agent: GoogleBot
Disallow: /
User-agent: BingBot
Disallow: /
User-agent: YandexBot
Disallow: /
User-agent: DuckDuckBot
Disallow: /
User-agent: SemrushBot
Disallow: /
User-agent: AwarioBot
Disallow: /
Deselecting only the options under Search Engine Crawlers generates:
User-agent: GPTBot
Disallow: /
User-agent: GoogleOther
Disallow: /
User-agent: LINER Bot
Disallow: /
User-agent: ClaudeBot
Disallow: /
User-agent: Timpibot
Disallow: /
User-agent: GoogleBot
Disallow: /
User-agent: BingBot
Disallow: /
User-agent: SemrushBot
Disallow: /
User-agent: AwarioBot
Disallow: /
User-agent: DotBot
Disallow: /
User-agent: MJ12bot
Disallow: /
Note that GoogleBot and BingBot are still generated, while PetalBot and Factset_spyderbot are now missing. The expected behavior is that all of the AI and SEO options would be included and none of the Search Engine Crawlers would be.
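For reference, the output I would expect from that second test (with only the Search Engine Crawlers deselected), assuming PetalBot and Factset_spyderbot belong under the AI options, would be:

User-agent: GPTBot
Disallow: /
User-agent: GoogleOther
Disallow: /
User-agent: PetalBot
Disallow: /
User-agent: Factset_spyderbot
Disallow: /
User-agent: LINER Bot
Disallow: /
User-agent: ClaudeBot
Disallow: /
User-agent: Timpibot
Disallow: /
User-agent: SemrushBot
Disallow: /
User-agent: AwarioBot
Disallow: /
User-agent: DotBot
Disallow: /
User-agent: MJ12bot
Disallow: /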
