I don’t know what Next.js prerendering does, so I can’t comment on that. But I can explain what Netlify’s prerendering does, so maybe you can compare the two yourself.
When turned on, Netlify prerendering basically executes your page’s JavaScript and keeps the resulting HTML in a cache. Whenever a bot user agent, like Facebook’s crawler or a search engine, requests the page, it’s served that cached response instead of your original page.
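Conceptually, the serving logic boils down to a user-agent check. Here’s a rough sketch of the idea — the bot list and function names are my own illustration, not Netlify’s actual implementation:

```javascript
// Hypothetical sketch of how a prerendering layer decides which response to
// serve. The bot patterns below are examples, not Netlify's real list.
const BOT_PATTERNS = [
  /facebookexternalhit/i, // Facebook's link-preview crawler
  /twitterbot/i,
  /googlebot/i,
  /linkedinbot/i,
];

// Returns true when the user agent looks like a known crawler.
function isBot(userAgent) {
  return BOT_PATTERNS.some((re) => re.test(userAgent || ""));
}

// Bots get the cached, prerendered HTML; everyone else gets the
// original JS-driven page.
function pickResponse(userAgent, prerenderedHtml, originalPage) {
  return isBot(userAgent) ? prerenderedHtml : originalPage;
}

const cached = "<html><body>Prerendered content</body></html>";
const original = '<html><body><div id="root"></div><script src="app.js"></script></body></html>';

console.log(pickResponse("facebookexternalhit/1.1", cached, original) === cached);   // true
console.log(pickResponse("Mozilla/5.0 (Windows NT 10.0)", cached, original) === original); // true
```

So a regular browser never matches the bot list and always falls through to the live page, which is also why you never see the cached copy yourself.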
That’s a rather strange question; maybe you meant to ask something else? Could you rephrase it? From what I know, Next.js and Gatsby have many uses. For starters, you can’t build a website using Netlify alone, but you can with either of the other two.
You don’t see it because you’re never served the prerendered version of your website; you always see the fresh content. Since the prerendered pages are heavily cached (for the next 48 or so hours), Netlify doesn’t want your regular users seeing potentially outdated content, so they always get the live page. Thus, ordinary visitors are not affected.
From what I remember from your previous thread, you wish to manually control what these bots can see. Prerendering won’t let you do that. Netlify’s prerendering will render whatever you’ve deployed, so you don’t really get any control in that area. It’s all or nothing, with no in-between.
You might have some luck with Snippet Injection, but I personally haven’t used it yet, so I’m not sure whether it’s a good fit for this use case. Still, it’s worth checking out.