Hi James,
I think the sales team reached out to you about the custom settings we can enable for large sites.
However, for the edification of everyone else:
- building sites using a system like Hugo is totally doable - we have folks building 40k+ page sites on Hugo in a few minutes.
- the uploading is the problem. It takes a fraction of a second to upload each changed file to our CDN, and at 70k pages we're likely talking about close to 100k total assets once you count graphics, JS, CSS files, and so on. At even 5 files per second, which is close to the speed we see, that works out to 20,000 seconds - well over five hours. While we can tune our system to handle this load, and we do for customers in that situation, it is not enabled by default.
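To put rough numbers on that (illustrative figures, not a guarantee of real-world rates):

```python
# Back-of-envelope estimate for uploading every asset from scratch.
# Both numbers are illustrative; your real counts and rates will vary.
total_assets = 100_000      # HTML pages plus images, JS, CSS, etc.
uploads_per_second = 5      # roughly the per-file rate described above

seconds = total_assets / uploads_per_second
print(f"{seconds:,.0f} seconds ≈ {seconds / 3600:.1f} hours")
# 20,000 seconds ≈ 5.6 hours
```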
However, the good news is that most sites do NOT need to change every file with every deploy, and when you just add some content or change a few pages, the upload will be much faster, since we only upload changed files - we don't re-upload files we've seen before (so we don't deploy eleventy million copies of jquery.js, for instance).
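In case it helps to see the mechanics, here's a minimal sketch of that content-addressed pattern - not our actual implementation, and `already_stored` stands in for whatever server-side record of known fingerprints exists:

```python
import hashlib
from pathlib import Path

def digest(path: Path) -> str:
    """Fingerprint a file by its content, not its name or timestamp."""
    return hashlib.sha1(path.read_bytes()).hexdigest()

def files_to_upload(publish_dir: str, already_stored: set[str]) -> list[Path]:
    """Return only the files whose content hasn't been seen before."""
    return [
        f for f in Path(publish_dir).rglob("*")
        if f.is_file() and digest(f) not in already_stored
    ]
```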
This article talks more about the pattern of not changing every file on every deploy, and how that helps not just upload times but also site speed: [Support Guide] Making the most of Netlify's CDN cache
Serving the files is no problem - getting them onto the CDN is the time-consuming part. Some builds are fine by default; others embed a timestamp, or a CSS/JS filename with an ever-changing hash, in every single HTML file, which forces every page to re-upload on every deploy. So: our CDN is definitely up to BUILDING & SERVING sites at that size, but uploading them may need a bit of optimization on your end, and possibly a bit of help from us.
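To make the ever-changing-hash problem concrete, here's a toy illustration (the filenames and hashes are made up) of why fingerprinting one shared asset on every build dirties every page:

```python
# Toy illustration: a content hash baked into a shared asset's filename
# ends up inside every HTML page that references it, so regenerating the
# hash each build changes every page too - even if no content changed.
pages = [f"/posts/page-{i}.html" for i in range(70_000)]

build_1 = {p: '<link href="/css/main.3f2a9c.css">' for p in pages}
build_2 = {p: '<link href="/css/main.8d41e7.css">' for p in pages}

changed = [p for p in pages if build_1[p] != build_2[p]]
print(f"{len(changed):,} of {len(pages):,} pages need re-uploading")
# 70,000 of 70,000 - one hash change invalidates the whole site
```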