File-based caching in the /tmp directory

Site name: iamsterdam.com
Team name: iamsterdam

Hi,

We’ve been using file-based caching for (among other things) responses from our GraphQL CMS. It’s pretty simple: we give each GraphQL request a unique id and then write the cached response to /tmp/cache. This seemed to be working basically flawlessly: our website was fast and we could clear the entire cache by simply emptying the /tmp/cache folder.
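Roughly, the idea looks like this (a simplified sketch; the exact names in our codebase differ):

```ts
import { createHash } from "crypto";
import { promises as fs } from "fs";
import path from "path";

const CACHE_DIR = "/tmp/cache";

// Derive a stable id for a GraphQL request from its query and variables.
function cacheKey(query: string, variables: Record<string, unknown>): string {
  return createHash("sha256")
    .update(query + JSON.stringify(variables))
    .digest("hex");
}

// Return the cached response if we have one, otherwise fetch from the CMS and store it.
async function cachedCmsRequest(
  query: string,
  variables: Record<string, unknown>,
  fetchFromCms: () => Promise<unknown>
): Promise<unknown> {
  await fs.mkdir(CACHE_DIR, { recursive: true });
  const file = path.join(CACHE_DIR, cacheKey(query, variables) + ".json");
  try {
    return JSON.parse(await fs.readFile(file, "utf8"));
  } catch {
    const data = await fetchFromCms();
    await fs.writeFile(file, JSON.stringify(data));
    return data;
  }
}
```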

In the last few weeks, however, we’ve been having problems: the cache doesn’t seem to be very effective, which makes our website a lot slower, and our CMS client handles a lot more requests.

Now I’ve added an API call to check what’s inside the /tmp/cache directory at any given moment, and concluded that it is switching between different file contents and thus different servers. I’ve verified this by writing a file with a unique id to each server’s /tmp folder if it doesn’t exist yet. I get a different id most of the time, but the same id (and the same cache file contents) does come back multiple times.
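The check itself is simple, roughly like this (the file name is illustrative):

```ts
import { randomUUID } from "crypto";
import { promises as fs } from "fs";

const ID_FILE = "/tmp/instance-id";

// Write a unique id to /tmp the first time this server/container is seen,
// then return whichever id this instance already holds.
async function instanceId(): Promise<string> {
  try {
    return await fs.readFile(ID_FILE, "utf8");
  } catch {
    const id = randomUUID();
    await fs.writeFile(ID_FILE, id);
    return id;
  }
}
```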

Our caching is now basically unusable, because everything gets cached multiple times, and when we try to clear the cache, only the cache files on a single server are removed.

So my question is: has something changed in the way the /tmp folder for a site works? Did it always use the same server and folder location before, and has that changed in the last few weeks?

The /tmp folder on Netlify? I’m assuming you’re talking about Netlify Functions? If so, I don’t think this was ever expected to work. Each invocation can boot into an isolated container that doesn’t share resources with the previous invocation, so if this was working in the past, you got lucky at best.

Note that Functions run on AWS Lambda, so that’s another place where something could have changed.

Thanks for your reply. I see that containers are being reused, so I guess each container had built up its own cache in its /tmp directory, which is why it (kinda) worked.

What I am wondering about, though, is how I would handle caching resources that are loaded (via an external API, for example) in Netlify Functions (via Next.js’s getServerSideProps or API routes). There must be a way to avoid fetching the data again on every invocation (in other words, on every user request) and instead use some cached data?

Yes, but it’s not reliable or guaranteed, which is why I mentioned that each invocation can boot an isolated container.
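To be concrete about the “not guaranteed” part: you can keep data in module scope (or in /tmp) and a warm container will reuse it, but a cold start won’t. A rough sketch, assuming a hypothetical fetchData() helper and a 60-second TTL:

```ts
// Module-scope state survives only as long as this container stays warm.
let cached: { data: unknown; fetchedAt: number } | null = null;
const TTL_MS = 60_000;

export async function handler() {
  const now = Date.now();
  if (!cached || now - cached.fetchedAt > TTL_MS) {
    // Placeholder for the external API call you want to avoid repeating.
    cached = { data: await fetchData(), fetchedAt: now };
  }
  return {
    statusCode: 200,
    body: JSON.stringify(cached.data),
  };
}

// Hypothetical helper standing in for the real external API request.
async function fetchData(): Promise<unknown> {
  const res = await fetch("https://example.com/api/data");
  return res.json();
}
```

On a cold start, cached is null again, so the fetch happens regardless; treat this as best-effort only.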

Why not use ISR for this? If that’s not an option, there’s no caching you can do in Netlify Functions. You can store the data in an external database, but you’d still have to fetch it every time.
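For reference, ISR means returning a revalidate interval from getStaticProps, so Next.js regenerates the page in the background at most once per interval instead of hitting the external API on every request. A minimal sketch (the endpoint URL is a placeholder):

```tsx
import type { GetStaticProps } from "next";

// Regenerate this page in the background at most once every 60 seconds,
// so the external API is not hit on every user request.
export const getStaticProps: GetStaticProps = async () => {
  const res = await fetch("https://example.com/api/data"); // placeholder endpoint
  const data = await res.json();

  return {
    props: { data },
    revalidate: 60, // seconds
  };
};

export default function Page({ data }: { data: unknown }) {
  return <pre>{JSON.stringify(data, null, 2)}</pre>;
}
```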
