Issue: Occasional Slow Site Performance - Investigating How to Generate CDN Edge Cache Programmatically

Hi all,

I’m chasing the elusive 100/100 performance score on web.dev, and when testing my current project I’m noticing that load times vary greatly. After an initial support ticket, it appears that site data is stored in a main data center and served to the edge upon request. Here is the issue: when the site is loaded from the edge cache, it’s blazing fast, but when the current edge server doesn’t have it cached, it loads much slower.

With that said, is there a way for me to tell Netlify’s edge instances to create a cache entry for my site, ideally via the CLI or an API, so that upon each new build I can populate the edge cache and have the site perform consistently?

Below is the site in question when it hits a node that has cache, and one when it doesn’t.
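For context, what I have in mind is something like this post-build warming step. This is only a sketch: it assumes a Node 18+ environment with the global `fetch`, a `sitemap.xml` at the site root, and a hypothetical `example.com` domain, and I realize it would only warm whichever edge node happens to serve the machine running it, not every PoP.

```javascript
// warm-cache.js — rough post-build sketch that requests key pages so the
// CDN edge node serving *this* machine has them cached. Hypothetical URLs.

// Extract <loc> entries from a sitemap.xml string (simple regex, for illustration only).
function extractUrls(sitemapXml) {
  return [...sitemapXml.matchAll(/<loc>\s*(.*?)\s*<\/loc>/g)].map((m) => m[1]);
}

// Fetch each URL once to populate the edge cache (Node 18+ global fetch).
async function warmCache(urls) {
  for (const url of urls) {
    const res = await fetch(url);
    console.log(`${res.status} ${url}`);
  }
}

// Example usage (hypothetical domain):
// const xml = await (await fetch('https://example.com/sitemap.xml')).text();
// await warmCache(extractUrls(xml));
```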

Not completely relevant, but maybe you can consider using service workers to cache the website, so any load after the initial one is fast (and the site would even work offline). Service workers generally give a huge performance boost.
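A minimal cache-first service worker could look something like this. It’s only a sketch: the cache name, the precache list, and the `shouldCache` policy are my own assumptions for illustration, not anything Netlify-specific.

```javascript
// sw.js — minimal cache-first service worker sketch.
// CACHE_NAME and PRECACHE_URLS are placeholder assumptions.
const CACHE_NAME = 'site-cache-v1';
const PRECACHE_URLS = ['/', '/index.html', '/styles.css', '/app.js'];

// Illustrative policy: only serve same-origin GET requests cache-first.
function shouldCache(method, requestUrl, origin) {
  return method === 'GET' && new URL(requestUrl).origin === origin;
}

// Register listeners only when actually running in a service worker context.
if (typeof self !== 'undefined' && typeof caches !== 'undefined') {
  // On install, pre-populate the cache with the core assets.
  self.addEventListener('install', (event) => {
    event.waitUntil(
      caches.open(CACHE_NAME).then((cache) => cache.addAll(PRECACHE_URLS))
    );
  });

  // On fetch, answer from cache first and fall back to the network.
  self.addEventListener('fetch', (event) => {
    if (!shouldCache(event.request.method, event.request.url, self.location.origin)) return;
    event.respondWith(
      caches.match(event.request).then((cached) => cached || fetch(event.request))
    );
  });
}
```

The page would register it with something like `navigator.serviceWorker.register('/sw.js')`. Note that a cache-first strategy like this means repeat visitors may see stale assets until you bump the cache name on a new deploy.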

Hi, @findcanary. There is no way to force your site to stay in the edge cache constantly, other than to have such constant traffic that it never leaves the cache. Even then, if other sites generated higher traffic on the CDN node in question, your site could still be evicted from the edge cache.

To summarize, it is nearly impossible for your site to always be in the edge cache. If you are looking for improved CDN performance, there is an Enterprise plan which uses a different CDN.

The service worker recommendation is a good one as well.

Hey, thanks for the information! I wanted to know if it’s possible before selecting other options. Have you used the Enterprise CDN before?

While I understand service workers help the user experience, I don’t think they will help with the load time Google measures, would they? Maybe there is something I’m missing here.

Since Google’s hosted audit tools are not going to retain your service worker cache, it won’t help there. However, if you use Chrome’s built-in Lighthouse audit on a repeat visit, it might show up, as your files will be served from the service worker’s cache.

However, in my personal opinion, a 100 score in the audit is an unnecessary obsession. What you have now is already great, so I wouldn’t suggest chasing the remaining 4 points: in the end, benchmarks always differ from real-world usage. For example, if someone visits your website over a 3G connection, their experience is going to be poor if you only focus on getting a 100 when Google tests your website from its own fast servers. Instead, I would focus on improving the experience in small ways for those on slower connections. For them, service workers would be a great help if they have already visited your website before.

And as already stated above, it’s not possible to force your website into the cache. So you need not worry about a score in the 70s, as it’s something out of your control.

@hrishikesh,
So my issue isn’t with the 4 points at the end; I’m completely satisfied with my scores when they’re in the 90s. However, there are many times it scores in the 40s, 50s, 70s, and 80s depending upon the response time of Netlify’s CDN. This is the core issue I’m looking to solve, as there are many times it’s just outright slow. When I run the same site directly from a physical server, it scores 94 - 98 100% of the time.

Yeah, just as I said. Server speed is something that’s not in your control unless you’re running your own server, so you shouldn’t worry about things you can’t control. Even I sometimes get performance in the 60s and sometimes in the 90s. That’s just how it is.

Well, ultimately server speed is in my control; I just need to select a different hosting provider :).