My site: https://zuko-iroh-tutor.netlify.app/
I am using an edge function to deliver content and I have opted into caching by configuring it in netlify.toml:
[[edge_functions]]
  function = "serve_content"
  path = "/*"
  cache = "manual"

[[headers]]
  for = "/*"
  [headers.values]
    Cache-Control = "public, max-age=3600, s-maxage=86400, stale-while-revalidate=86400"
Then I set the headers inside the edge function like this:
  const headers = new Headers(response.headers);
  headers.set('Netlify-CDN-Cache-Control', 'public, max-age=3600, s-maxage=86400, stale-while-revalidate=86400');
  headers.set('Netlify-Vary', 'query=url');

  return new Response(response.body, {
    status: response.status,
    statusText: response.statusText,
    headers: headers
  });
};

// Enable caching for this edge function
export const config = {
  cache: "manual"
};
The caching then seems to work: when I rapidly test it with multiple requests, the first one is a miss and the rest are hits. But when I wait one minute, the exact same request from the same agent returns either a miss or stale. And when the result is stale, the cached content is not served the way stale-while-revalidate is supposed to serve it.
I thought the agent I use for testing might somehow be altering the request so that it misses the cache, but then the result would simply be a miss, not stale.
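For reference, this is roughly how I reproduce it (a minimal sketch using plain fetch, run with Deno or Node 18+ as an ES module; my actual test agent is different):

// Minimal reproduction sketch (my real test agent differs): send the same request
// a few times and log the cache-related response headers.
const url = 'https://zuko-iroh-tutor.netlify.app/';

async function probe(label) {
  const res = await fetch(url);
  console.log(label, '| cache-status:', res.headers.get('cache-status'), '| age:', res.headers.get('age'));
}

// A few rapid requests: the first is a miss, the rest are hits
for (let i = 1; i <= 3; i++) await probe(`rapid #${i}`);

// After a minute the exact same request comes back as a miss or stale
await new Promise((resolve) => setTimeout(resolve, 60_000));
await probe('after 60s');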
To try to cache more aggressively, I also set global caching rules in the toml. (I understand this does not really make sense; my understanding is that there is an edge cache in front of the edge function, and then a separate cache for the static files that the edge function may retrieve if it runs, and the config in the toml applies to that second cache.)
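To show what I mean by the two layers, here is my mental model as a sketch (not my live code, and I may be wrong about it): as far as I understand, Netlify-CDN-Cache-Control only steers Netlify's edge/CDN cache and is not forwarded to the browser, while Cache-Control is what browsers see.

// Sketch of my mental model only; header semantics as I understand them.
function withCacheHeaders(response) {
  const headers = new Headers(response.headers);
  // Long TTL plus stale-while-revalidate for the edge cache in front of the edge function
  headers.set('Netlify-CDN-Cache-Control', 'public, s-maxage=86400, stale-while-revalidate=86400');
  // Shorter TTL for browsers, matching the [[headers]] block in netlify.toml
  headers.set('Cache-Control', 'public, max-age=3600');
  return new Response(response.body, {
    status: response.status,
    statusText: response.statusText,
    headers: headers
  });
}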
With the current setup I sometimes get this as the final response (and sometimes miss and stale are swapped):
Cache-status: "Netlify Edge"; fwd=miss, "Netlify Edge"; fwd=stale
This is the whole code of the edge function:
export default async (request, context) => {
  const isStaticFile = request.headers.get('X-Is-Static-File') === 'true';
  const userAgent = request.headers.get('user-agent') || '';
  // Basic user-agent based categorization (bot vs. human)
  const userCategory = /bot|crawler|spider/i.test(userAgent) ? 'bot' : 'human';
  const url = new URL(request.url);

  console.log(`[SERVE-CONTENT] Received userCategory: ${userCategory} | isStaticFile: ${isStaticFile}`);

  // Handle bot requests with prerender
  if (userCategory === 'bot' && !isStaticFile) {
    console.log(`[SERVE-CONTENT] Bot detected, starting prerender for: ${url.pathname}`);

    const prerenderUrl = new URL('/.netlify/functions/prerender', request.url);
    prerenderUrl.searchParams.set('url', url.pathname);
    prerenderUrl.searchParams.set('user-agent', userAgent);
    console.log(`[SERVE-CONTENT] Prerender URL: ${prerenderUrl.toString()}`);

    try {
      const prerenderResponse = await fetch(prerenderUrl.toString(), {
        headers: {
          'User-Agent': userAgent,
          'Accept': 'text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8',
          'Accept-Language': 'en-US,en;q=0.5',
          'Accept-Encoding': 'gzip, deflate',
          'Connection': 'keep-alive',
          'Upgrade-Insecure-Requests': '1'
        }
      });

      const prerenderContent = await prerenderResponse.text();
      const prerenderHeaders = new Headers();

      // Copy essential headers from prerender response
      if (prerenderResponse.headers.get('content-type')) {
        prerenderHeaders.set('Content-Type', prerenderResponse.headers.get('content-type'));
      }
      if (prerenderResponse.headers.get('content-encoding')) {
        prerenderHeaders.set('Content-Encoding', prerenderResponse.headers.get('content-encoding'));
      }
      if (prerenderResponse.headers.get('x-prerender-request-id')) {
        prerenderHeaders.set('x-prerender-request-id', prerenderResponse.headers.get('x-prerender-request-id'));
      }
      if (prerenderResponse.headers.get('server-timing')) {
        prerenderHeaders.set('server-timing', prerenderResponse.headers.get('server-timing'));
      }

      // Add our own headers
      prerenderHeaders.set('X-Bot-Type', 'prerender');
      prerenderHeaders.set('X-Processing-Method', 'edge-direct-fetch');

      console.log(`[SERVE-CONTENT] Prerender successful - Status: ${prerenderResponse.status} | Content-Type: ${prerenderResponse.headers.get('content-type')} | Content-Length: ${prerenderContent.length}`);

      return new Response(prerenderContent, {
        status: prerenderResponse.status,
        headers: prerenderHeaders
      });
    } catch (error) {
      console.error(`[SERVE-CONTENT] Prerender fetch failed: ${error.message}`);
      console.log(`[SERVE-CONTENT] Falling back to redirect for: ${url.pathname}`);

      // Fallback to redirect if direct fetch fails
      return new Response(null, {
        status: 302,
        headers: {
          'Location': prerenderUrl.toString()
        }
      });
    }
  }

  // Handle all other requests (humans, static files, etc.)
  console.log(`[SERVE-CONTENT] Processing ${userCategory} request for: ${url.pathname}`);

  const response = await context.next();
  console.log(`[SERVE-CONTENT] Upstream response - Status: ${response.status} | Content-Type: ${response.headers.get('content-type')}`);

  // Create new headers with cache settings
  const headers = new Headers(response.headers);
  headers.set('Netlify-CDN-Cache-Control', 'public, max-age=3600, s-maxage=86400, stale-while-revalidate=86400');
  headers.set('Netlify-Vary', 'query=url');

  console.log(`[SERVE-CONTENT] Final response - Status: ${response.status} | UserCategory: ${userCategory} | Headers: ${[...headers.keys()].length} total`);

  return new Response(response.body, {
    status: response.status,
    statusText: response.statusText,
    headers: headers
  });
};

// Enable caching for this edge function
export const config = {
  cache: "manual"
};
I would be extremely grateful if anyone has an idea what I could be doing wrong. I know the configuration is not perfect, but even though it configures some things twice, I don't understand how the result could behave like a cache with such a short max-age.
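For completeness, a stripped-down variant with caching configured in only one place would look roughly like this (a sketch I have not deployed; it assumes the global [[headers]] block is removed from netlify.toml):

// Stripped-down sketch: caching controlled from the edge function only,
// assuming the [[headers]] block has been removed from netlify.toml.
export default async (request, context) => {
  const response = await context.next();
  const headers = new Headers(response.headers);
  headers.set('Netlify-CDN-Cache-Control', 'public, s-maxage=86400, stale-while-revalidate=86400');
  headers.set('Netlify-Vary', 'query=url');
  return new Response(response.body, {
    status: response.status,
    statusText: response.statusText,
    headers: headers
  });
};

// Caching still has to be opted into for this edge function
export const config = {
  cache: "manual"
};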
Thank you in advance to anyone.