I have been using Ahrefs to crawl my site every weekend for the past few months while I've been doing SEO updates. My site name is bookeasyclean. A couple of weeks ago, a lot of pages started returning 504 Gateway Timeout errors out of nowhere. After some digging, I found a suggestion to disable pre-rendering, which I did last weekend. Today my site is uncrawlable because robots.txt times out when the Ahrefs bot tries to fetch it. The AhrefsSiteAudit crawler is returning: "Fetching robots.txt took too long"
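In case it helps anyone debugging the same symptom: you can roughly reproduce what the crawler sees by timing the robots.txt fetch yourself. This is a minimal sketch using only the Python standard library; the URL in the comment is a placeholder, not my real domain, so substitute your own.

```python
import time
import urllib.request

def timed_fetch(url, timeout=10):
    """Fetch a URL and return (status_code, elapsed_seconds).

    Crawlers give up if robots.txt does not answer quickly, so an
    elapsed time of more than a couple of seconds here is a red flag.
    """
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read()
        return resp.status, time.monotonic() - start

# Example with a placeholder domain -- use your own site:
# status, elapsed = timed_fetch("https://example.com/robots.txt")
# print(status, f"{elapsed:.2f}s")
```

If this call itself hangs or raises a timeout, the problem is on the server side (or a CDN/firewall rule singling out bots) rather than anything specific to Ahrefs.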
Hello, I checked and saw that you have solved it. I am having the same problem: "Fetching robots.txt took too long". Could you share how you solved it? Thank you.