I’m having a problem that might be related to Netlify: I push changes, verify them, and invalidate and redo the Facebook scrape, but Facebook still shows the old information.
Does anyone have a Netlify site and manage its Facebook Open Graph information?
Mind sharing the URL of the page so we can check for the issue?
Have you turned on Prerendering by any chance?
Yes, I have turned on prerendering.
Then there’s a good chance Prerendering is causing the issue. Prerendering serves a cached response to the user agents of these scraping bots, so your changes are not reflected immediately. I think the cache persists for 24 to 48 hours. If you want the changes to be visible immediately, you’d have to disable prerendering.
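If you want to confirm what the scraper is actually being served (as opposed to what your browser gets), you can request the page with Facebook’s crawler user agent yourself. A rough sketch, assuming Node 18+ so the built-in `fetch` is available; the URL is a placeholder:

```ts
// check-og.ts — a rough sketch, assuming Node 18+ (global fetch).
// The URL is a placeholder; substitute the page you're debugging.
async function main(): Promise<void> {
  const res = await fetch("https://your-site.netlify.app/", {
    headers: { "User-Agent": "facebookexternalhit/1.1" }, // Facebook's crawler UA
  });
  const html = await res.text();

  // Pull out the og: meta tags from the response so they can be compared
  // with what was actually deployed.
  const ogTags = html.match(/<meta[^>]+property="og:[^"]*"[^>]*>/g) ?? [];
  console.log(ogTags.length ? ogTags.join("\n") : "No og: meta tags found");
}

main().catch(console.error);
```

If the tags printed here are stale while a normal browser request shows the new ones, that points at the cached prerendered response.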
That was it, thanks. So I have a question: does Netlify prerendering accomplish what Next.js prerendering does? Why would a Netlify user use Next.js or Gatsby?
I’m still confused about pre-rendering. When I have pre-rendering enabled and use View Page Source on my site, I only see a little HTML and script tags linking to JavaScript. I expected the HTML to be pre-rendered.
I currently have two distinct index.html files so that my single repository can provide two separate websites. If pre-rendering can let me accomplish this by putting variables in a single index.html file, I would like to do that instead.
I don’t know what Next.js prerendering does, so I can’t comment on that. But I can explain what Netlify prerendering does, so maybe you can compare.
When turned on, Netlify prerendering processes your page’s JavaScript and keeps a prerendered copy in a cache. Whenever a user agent such as Facebook’s crawler or a search engine bot requests the page, it is served that cached response instead of your original page.
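To make the mechanism concrete, here’s a rough sketch of the general idea (not Netlify’s actual implementation): a server inspects the User-Agent and serves a cached, already-rendered copy to known crawlers, while ordinary browsers get the usual JavaScript-driven page. The file names and bot list below are illustrative assumptions.

```ts
// prerender-sketch.ts — a minimal illustration of the *idea* behind prerendering,
// not Netlify's implementation. Assumes two files next to this script:
// shell.html (the normal JS-driven page) and prerendered.html (a cached,
// fully rendered copy produced ahead of time).
import { createServer } from "node:http";
import { readFileSync } from "node:fs";

// Crawlers that typically don't execute JavaScript get the cached copy.
const BOT_PATTERN = /facebookexternalhit|Twitterbot|Googlebot|LinkedInBot/i;

const shell = readFileSync("shell.html", "utf8");
const prerendered = readFileSync("prerendered.html", "utf8");

createServer((req, res) => {
  const ua = req.headers["user-agent"] ?? "";
  const body = BOT_PATTERN.test(ua) ? prerendered : shell;
  res.writeHead(200, { "Content-Type": "text/html" });
  res.end(body);
}).listen(8080);
```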
That’s a rather strange question; maybe you meant to ask something else? Could you rephrase it? From what I know, there are many uses for Next.js and Gatsby. For starters, you can’t build a website with Netlify alone, but you can with the other two.
You don’t see it because you’re not served the pre-rendered version of your website; you always see the fresh content. Since prerendered pages are heavily cached, Netlify doesn’t want your users seeing content that could be up to 48 hours out of date, so regular visitors are never served the cached copy.
From what I remember of your previous thread, you want to manually control what these bots see. Prerendering won’t let you do that. Netlify’s prerendering renders whatever you’ve deployed, so you don’t really get any control in that area. It’s all or nothing, no in-between.
You might have some luck with Snippet Injection, but I personally haven’t used it, so I’m not sure it’s a perfect fit for this use case. It’s worth checking out, though.
Thanks. I think I’ll turn off pre-rendering. I don’t clearly see how this benefit of pre-rendering applies to me: “some robots and crawlers need ‘help’ navigating client-side links”.
My site is just one long page with lots of anchor tags. I don’t really understand exactly what help robots need, or what problems arise when they don’t have it. It would be nice if @fool could add some more explanation to that first paragraph. I don’t understand what problem Netlify pre-rendering solves.
When you build websites with React, they often use client-side navigation. That is, there is just one index.html for the entire website, and navigation to other pages is handled by JavaScript. So, if a crawler or bot doesn’t execute JavaScript, it will never be able to see the other pages. This is where prerendering helps: it runs the JavaScript and keeps those pages ready to serve to such bots.
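As a rough illustration of that client-side navigation, here is a hypothetical single-page “router”; the element id, routes, and markup are made up for the example. A crawler that never runs this script would only ever see the nearly empty index.html shell.

```ts
// router-sketch.ts — a tiny, hypothetical single-page router. It assumes an
// index.html containing <div id="app"></div> and a few <a href="/about">-style
// links. The other "pages" only exist once this script runs in the browser.
const pages: Record<string, string> = {
  "/": "<h1>Home</h1>",
  "/about": "<h1>About</h1>",
  "/contact": "<h1>Contact</h1>",
};

function render(path: string): void {
  const app = document.getElementById("app");
  if (app) app.innerHTML = pages[path] ?? "<h1>Not found</h1>";
}

// Intercept link clicks and swap content in place instead of loading a new page.
document.addEventListener("click", (event) => {
  const target = event.target as HTMLElement | null;
  const link = target?.closest("a");
  if (!link) return;
  event.preventDefault();
  const path = new URL(link.href).pathname;
  history.pushState({}, "", path); // change the URL without a full page load
  render(path);
});

window.addEventListener("popstate", () => render(location.pathname));
render(location.pathname);
```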