Unexpectedly Large Cached Dependencies / Long Cached Dependencies Extraction Time

Every build fetches and extracts cached dependencies before proceeding. The time taken has grown substantially and monotonically over the lifetime of the project, from barely a few seconds to almost a minute and a half. What can I do to investigate why and reduce it?

Here’s an example:

2:36:13 PM: Fetching cached dependencies
2:36:13 PM: Starting to download cache of 1.7GB
2:36:28 PM: Finished downloading cache in 14.997130807s
2:36:28 PM: Starting to extract cache
2:37:35 PM: Finished extracting cache in 1m6.901284412s
2:37:35 PM: Finished fetching cache in 1m21.985620897s

Obviously, the first question is: are the project’s dependencies in fact that large? du -h node_modules reports a total size of 1.3GB, which admittedly is not far off from the 1.7GB reported above. Nevertheless, I would like to understand what accounts for the difference.
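For what it’s worth, this is roughly how I’ve been breaking down node_modules locally to see where the bulk of that 1.3GB comes from (just a sketch; the paths are from my local checkout and the numbers will differ on the build image):

# Summarize each top-level entry in node_modules, largest first.
# The .[!.]* glob also catches hidden entries such as .cache and .bin.
du -sh node_modules/* node_modules/.[!.]* 2>/dev/null | sort -rh | head -20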

Over the life of the project, I’ve only observed the reported size of the fetched dependencies increase, never decrease. That’s true even after removing dependencies. Is it possible the cache still includes old dependencies that are no longer used? The project is a JS project using npm (not yarn).
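If stale packages lingering in a restored node_modules are part of the story, something like this should surface them (a sketch; npm flags packages not declared in package.json as “extraneous”, though whether the restored cache actually contains any is exactly what I don’t know):

# List top-level packages; anything npm reports as "extraneous" is
# installed in node_modules but no longer declared in package.json.
npm ls --depth=0
# Remove extraneous packages so they can't be carried into the next cache.
npm prune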

Here are other things I’ve done so far:

  • I ran a build that executed du -a ../cache/ to inspect the contents of the build server cache directory (a tidier version of that diagnostic is sketched after this list). The cache directory only has about 500MB of content. Surprisingly, I didn’t see my project’s node_modules there, only build server dependencies (Node versions, pip, Ruby, etc.)
  • Removed large dependencies from my project, ran “Clear cache and deploy site”, then ran it again using the dependency cache as normal. There was no change in the reported fetched cache size. (Granted, these builds failed because various function dependencies were missing, so if Netlify only caches dependencies on successful builds, that would explain why this experiment had no effect. But the build logs still have a “Caching artifacts” section at the end, even on failed builds.)
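For completeness, here’s roughly the diagnostic I prepended to the build command for that first bullet (a sketch only; it assumes the build runs from the repo root so the restored cache sits at ../cache, and npm run build is just a placeholder for the site’s real build command):

# Summarize the restored build cache and the project's dependencies
# before building, so the sizes show up in the deploy log.
du -sh ../cache/* 2>/dev/null | sort -rh
du -sh node_modules 2>/dev/null || true
npm run build   # placeholder: substitute the site's actual build command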

I’ve noticed something else curious.

I happen to have a completely separate Netlify site configured to point to a different branch on the same repo as the above site. They have the exact same dependencies, and much of the time the branches are completely in sync. And yet on this site, the cached dependencies are substantially smaller:

4:49:49 PM: Fetching cached dependencies
4:49:49 PM: Starting to download cache of 517.0MB
4:49:56 PM: Finished downloading cache in 6.34106003s
4:49:56 PM: Starting to extract cache
4:50:16 PM: Finished extracting cache in 19.986166069s
4:50:16 PM: Finished fetching cache in 26.43722241s

How could this be?

One possibility is my speculation above that the cache includes old dependencies that were subsequently removed. Since this second site is much newer than the first, old dependencies that were removed before the site was even created wouldn’t contribute.

Hey there, @AndrewK :wave:

Thanks for your patience here. It looks like this thread has been pretty quiet since you reached out! Are you still encountering this situation? If so, can you please share a link to your site as well as your most recent deploy log so that we can look into this further and best advise?

Thanks!

Hi Hillary, no problem, thanks for your reply.

Yes, this issue is still occurring.

I’ll message you directly with the details of the two sites and the latest builds.

Hey @AndrewK,

@hillary shared with us the details you sent her. I then exported the build cache for both of those builds. Yes, the difference is significant, even more so if you compare the unzipped folders.

The small one is about 1.97 GB (136,851 items) and the large one is about 7.54 GB (144,220 items).

I can share the zips with you if you want to inspect them yourself. But I think you could try “clear cache and deploy”, which should get rid of the previous cache, and the size should become more realistic. Furthermore, the cache is not stored permanently: if you don’t publish your site for a while, the cache will eventually be dropped and we’d have to create a new cache anyway.

Does the clear cache option work for you?


Thanks hrishikesh, I appreciate those details. It’s good to know that the cache is compressed – I had assumed so but wasn’t positive.

It would be great if you could share the zips with me, as they would help me diagnose what’s accounting for the difference. What’s your preferred way to do that? (Obviously I would prefer it not be here in a public thread!)

I’ve run “clear cache and deploy” many times on the site, but it didn’t resolve the issue. Any idea why not? My guess is that, while “clear cache and deploy” rebuilds the site from scratch without using the cache, it doesn’t fully replace the existing cache once done. For example, the cache might be appended to rather than overwritten, or there might be multiple build servers with independent caches, so clearing it on one doesn’t help with the others. But that’s just a hypothesis.

As for not publishing the site in a while: we use CI/CD and typically deploy multiple times per day. That said, we had a period in late December, over the holidays, with no deploys for about a week, and that had no effect.

I’ll recreate readable zips of those unzipped folders and send Google Drive links to you privately. This will most likely happen tomorrow, as it’s taking quite a while just to copy everything into a zip because of the sheer number of files - it’s estimating a few hours to complete.

About clear cache and deploy, this is how it’s supposed to work:

When you clear the cache, we start with a clean slate - we fetch your repo and download your build dependencies fresh, as if no cache had ever existed. Then, once the build is done, we save the cache of that specific build to be used by the next build. We don’t append to the same cache, and certainly not when the cache has been cleared, since there is no existing cache for that build to append to.

If that’s not working as described for you, it would need further investigation as that is not expected behaviour. If you can share those deploy links, that would help us check this further.

About the second point: we usually store the cache for about a month or two. So this is probably why the cache was maintained during the holiday season.

Great, thanks very much.

Regarding how the cache clearing mechanism is supposed to work, that’s interesting. Does the behavior you described apply regardless of whether “clear cache and deploy” is run on a preview deploy vs. a production deploy? It’s quite likely I’ve only ever done it on a preview deploy.

In the meantime, I’ve run an experiment which may help shed some additional light. It appears “clear cache and deploy site” worked temporarily, but not entirely, as you’ll see below.

Automatic preview build on 3-Feb
This is a recent example build where the cache is unexpectedly large. (Timestamps are from yesterday, 3-Feb, not today).

2:48:26 PM: Fetching cached dependencies
2:48:26 PM: Starting to download cache of 1.8GB
2:48:38 PM: Finished downloading cache in 12.37429732s
2:48:38 PM: Starting to extract cache
2:49:51 PM: Finished extracting cache in 1m12.663144568s
2:49:51 PM: Finished fetching cache in 1m25.102439357s

Manually retriggered today 4-Feb with “Clear cache and deploy site”
I manually retriggered the above build with “Clear cache and deploy site”, creating this new build.

2:15:32 PM: Building without cache
2:15:32 PM: Starting to prepare the repo for build
2:15:33 PM: No cached dependencies found. Cloning fresh repo

Manually triggered again today 4-Feb with “Deploy site”
After the above completed, I manually retriggered that same build again, now with “Deploy site”, creating this new build.

2:33:25 PM: Fetching cached dependencies
2:33:25 PM: Starting to download cache of 541.9MB
2:33:29 PM: Finished downloading cache in 3.500149014s
2:33:29 PM: Starting to extract cache
2:33:51 PM: Finished extracting cache in 22.108303766s
2:33:51 PM: Finished fetching cache in 25.705148758s

This is good and promising! The cache size is much smaller, as expected.

Automatic preview build on a new pull request on 4-Feb
After the above, I created a brand new pull request which triggered this automatic preview build:

2:41:50 PM: Fetching cached dependencies
2:41:50 PM: Failed to fetch cache, continuing with build
2:41:50 PM: Starting to prepare the repo for build
2:41:51 PM: No cached dependencies found. Cloning fresh repo

It’s unclear why it failed to fetch the cache. I canceled that build and retried it with “Deploy site”.

2:42:16 PM: Fetching cached dependencies
2:42:16 PM: Failed to fetch cache, continuing with build
2:42:16 PM: Starting to prepare the repo for build
2:42:17 PM: No cached dependencies found. Cloning fresh repo

Again it failed to fetch the cache. I decided to create another fresh pull request, described below.

Automatic preview build on another new pull request on 4-Feb
I created another fresh pull request as a test, creating this automatic preview build:

2:43:53 PM: Fetching cached dependencies
2:43:54 PM: Starting to download cache of 1.8GB
2:44:10 PM: Finished downloading cache in 16.381322342s
2:44:10 PM: Starting to extract cache
2:45:22 PM: Finished extracting cache in 1m12.333293177s
2:45:22 PM: Finished fetching cache in 1m28.812956181s

Once again, the cache is back to 1.8GB!

About 25min later, I created another pull request, this time just as part of normal development, resulting in this preview build:

3:09:49 PM: Fetching cached dependencies
3:09:49 PM: Starting to download cache of 1.8GB
3:09:59 PM: Finished downloading cache in 10.293562219s
3:09:59 PM: Starting to extract cache
3:11:03 PM: Finished extracting cache in 1m4.048569112s
3:11:03 PM: Finished fetching cache in 1m14.418595856s

And again, the cache is at 1.8GB.

The cache for each deploy type is maintained separately: the production branch cache is only used for the next deploy on the production branch, any other branch deploy uses only the cache from that branch’s previous deploy, and similar rules apply to Deploy Previews.

About the rest of the investigation, I’ll try that tomorrow or the day after, since it’s quite late here - about 2 AM. Meanwhile, I’ve shared the folder via Google Drive. Let me know if you can see it.

Thanks, yes, I have access to the files. I looked through them, and the difference in size is due to the node_modules/.cache directory. That’s in my court to further investigate and understand.
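For anyone following along, this is the sort of follow-up I have in mind (a sketch, assuming the extra weight really is tool caches - e.g. babel-loader, terser, or ESLint output - under node_modules/.cache, and that nothing in my build needs them to persist between deploys):

# Measure the offending directory...
du -sh node_modules/.cache
# ...and remove it at the end of the build, before the dependency cache
# is saved, so it never gets carried into the next build.
rm -rf node_modules/.cache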

Thanks for the info.

Just to clarify, suppose I open a PR pointing to the production branch, thereby triggering a preview deploy. I see that the preview deploy build uses the dependency cache. But which dependency cache? Is it (a) the same dependency cache that production deploy itself uses, or (b) a separate dependency cache used by preview deploys but not by production deploys? From your explanation, it sounds like it’s (b). But if that’s the case, it’s a puzzle why, in my experiment above, clearing the cache on a preview fixed the issue only temporarily, but the issue resurfaced for a subsequent preview.

My hypothesis is that some aspect of (a) does in fact happen (perhaps only when the preview build is first run, and not on subsequent rebuilds of that preview). As a consequence, perhaps I can resolve this simply by running “clear cache and deploy” on my production build (something I may never have done, or at least not in a long time).

A further update: I went ahead and ran “clear cache and deploy” on my most recent production build, then created a new test PR to trigger a preview build, and that preview build now has the smaller dependency cache as expected.

This lends credence to my hypothesis in my last post that there is in fact a relationship between the production dependency cache and the dependency cache for previews of production.

Assuming this continues for more preview builds in the coming days, I can consider the issue resolved.

Your hypothesis might make sense. Since Deploy Previews are generally a set of changes on top of a base branch, chances are we use that branch’s cache for the previews. Let us know if this bothers you again and we can take another look.