Deployment takes a really long time (1.5h) to process (mostly stuck in "Unpacking archive")

Hey there,

I have a reasonably large site that I update via the API (Zip approach).

Site/deployment information:
Site ID: stvad-fb764098-a72b-467f-b6e8-ff93a74268cc
Deployment package size: ~100 MB
Number of files: ~20,800
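For context, the Zip approach is a single API call: POST the archive as the request body to the documented deploys endpoint. A minimal sketch, where `public/` stands in for the real build output and `your-site-id` and `NETLIFY_AUTH_TOKEN` are placeholders:

```shell
# Zip deploy sketch; public/ is a stand-in for the real build output.
mkdir -p public && printf '<h1>hello</h1>' > public/index.html
zip -r -q site.zip public/

SITE_ID="your-site-id"                      # placeholder
if [ -n "${NETLIFY_AUTH_TOKEN:-}" ]; then   # only send when a token is configured
  curl -s -X POST "https://api.netlify.com/api/v1/sites/$SITE_ID/deploys" \
    -H "Authorization: Bearer $NETLIFY_AUTH_TOKEN" \
    -H "Content-Type: application/zip" \
    --data-binary "@site.zip"
fi
```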

And it takes a really long time to deploy after upload (1-1.5h).

Most of that time is spent in “Unpacking the archive”, which does not seem reasonable to me. Sure, it’s a large-ish archive, but it takes on the order of tens of seconds to unpack locally; 1h seems extremely strange. Is there anything I can do to improve on that?

The second aspect is post-processing, which, while not as extreme as the unpacking time, is still considerable. I don’t actually need post-processing for these sites: there are no forms/etc., and I mostly want to serve them as-is. Is there a way for me to disable post-processing from the API? (I know the Disable form detection toggle in the Netlify UI gets partway there, but it doesn’t quite work for me.)

I tried using the Netlify CLI (to rely on the digest upload method instead of the ZIP).
It did somewhat better on the initial upload: ~40 minutes (which is still pretty bad, but an improvement).
But then a subsequent upload with the CLI (where ~half of the files had changed, as generated by Gatsby) was back to ~1.5h :frowning:
Is there anything I can do to speed this up?
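For reference, the CLI run was along the lines of `netlify deploy --dir=public --prod`. My understanding of the digest method is that each file is identified by a checksum (SHA1, as far as I can tell), and only files whose digest the service hasn’t seen get uploaded, which is why unchanged files are cheap and changed files are not. A tiny illustration, assuming `sha1sum` matches the digest in use:

```shell
# Digest-style dedup: same content => same SHA1 => upload can be skipped.
mkdir -p demo
printf 'hello' > demo/index.html
sha1sum demo/index.html    # digest unchanged across deploys => skipped
printf 'hello, world' > demo/index.html
sha1sum demo/index.html    # content changed => new digest => re-uploaded
```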

Hi there @stvad and thanks for your patience while our team reviewed this.

It’s not unexpected for a site this large to take that long to deploy; we have to scan every file with a checksum we haven’t seen before, applying any asset optimization settings you have enabled. And while I see you don’t have any explicitly enabled here:

…you could save some time by turning off form detection as well, here: Netlify App

But the best time savings will come from not changing most files in your deploy, which your screenshot shows is happening:


This article has more details on the speed gains to be had for a normal site update that DOESN’T change most files in the deploy:

Take a look and let me know if you have any questions, but I’d be shocked if your site took that long if you deployed, and then deployed the same zipfile again with only a single file changed. That would be an interesting experiment to hear the results of, if you do conduct it, as our team is looking into speeding up the processing (which is what happens after the archive is unpacked; we have several processing steps, some of which occur before post-processing begins :))

But that’s the thing: most of the time is spent in “unpacking the archive”, not in post-processing. Disabling form detection helped, removing ~10 minutes from the post-processing phase, but the original “unpacking” step still takes 1+ hour :frowning:

I’ll try the experiment with a ~identical archive, though I’m not sure it will help me much, as the file changes are driven by the Gatsby build and occur even when the source material is almost the same.

But even if files do change, why does it take such a long time to “unpack” if no processing is enabled?

Our logs are not 100% clear about what is happening: the “unpacking” step also covers our PRE-processing, which happens after the archive is unpacked but isn’t labeled separately in the logs. My description reflects knowledge which I had and you did not :slight_smile:
So, you’ll want to reduce the changes in the files to reduce the processing time; that’s where we spend the time, not in the unpacking.

You can control Gatsby’s tendency to change everything: choose not to use the asset hash/fingerprinting feature.
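One way to see how much churn fingerprinting causes is to diff two consecutive build outputs. A sketch with two hypothetical snapshots of the `public/` directory (`build-1/` and `build-2/` are stand-ins, not real Gatsby output):

```shell
# Compare two build snapshots: stable names diff cleanly, while
# fingerprinted names show up as added/removed files every build.
mkdir -p build-1 build-2
printf 'stable' > build-1/page-data.json
printf 'stable' > build-2/page-data.json
printf 'bundle v1' > build-1/app-abc123.js   # hash in the name changes per build
printf 'bundle v2' > build-2/app-def456.js
changed=$(diff -rq build-1 build-2 | wc -l)
echo "$changed entries differ between builds"
```

Every entry reported here is a file the deploy has to treat as new, so driving this count down is what shortens the processing phase.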

I see. Is there any way to disable all pre-processing and use the uploaded site as-is, to speed things up?

You can control Gatsby’s tendency to change everything: choose not to use the asset hash/fingerprinting feature.

Thanks, I’ll explore that. I’d appreciate links to more details on this too!

Hi @stvad, I don’t think we have any specific guides for this, as it’s a Gatsby-specific topic, but if you come across anything interesting, it would be great if you could share it with us!