Support Forums

Choosing on-demand builder vs. vanilla serverless response within a single handler


From the on-demand builder docs, as well as the NPM package’s exposed functions, it looks like the only supported way to use an on-demand builder is by wrapping the entire serverless function.

That means that the NPM package doesn’t provide a way to decide to return an on-demand builder response or a vanilla serverless response after doing any computation. A serverless function must be either an on-demand builder serverless function, or a vanilla serverless function.

The use case I’m trying to support is wiring up a render function from a JAMstack framework and using that render function’s response to indicate whether it is handling an on-demand route or not. This is similar to how NextJS returns revalidate: 1 to indicate ISR, for example: you need to execute the code to know whether the response should be cached or not.


From looking at the underlying source code of the Netlify functions NPM package, it looks like I could do a hack to include

    metadata: { version: 1, builder_function: true },

in my serverless function’s response to indicate dynamically, at execution time, whether the response should be cached as an on-demand builder response or not.
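As a sketch of that hack (function and field names here are hypothetical, and it relies on undocumented internals, so it could break at any time), the idea is to attach the marker only when the framework’s render result asks for caching:

```javascript
// Hypothetical sketch of the hack described above: decide at execution time
// whether to tag the response as an on-demand builder response.
// WARNING: relies on undocumented internals of the Netlify functions runtime.
function toLambdaResponse(renderResult) {
  const response = {
    statusCode: 200,
    headers: { "content-type": "text/html" },
    body: renderResult.html,
  };
  if (renderResult.cache === "on-demand") {
    // The undocumented marker that the builder() wrapper adds internally.
    response.metadata = { version: 1, builder_function: true };
  }
  return response;
}
```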

Without that approach, the only other way I can think of to solve this problem would be using redirect rules for all routes. The challenge there is that I have some routes I can express as a regex, but can’t express using the /blog/:slug syntax, because they are a little more granular than that.
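To illustrate (function names here are hypothetical), a _redirects rule can route placeholder-style paths to a builder-wrapped function, but a segment that mixes static and dynamic parts has no equivalent:

```
# Expressible: a whole dynamic segment
/blog/:slug  /.netlify/functions/render-odb  200

# Not expressible in _redirects: a mixed segment such as
# /post/2021-06-15 (i.e. /post/[year]-[month]-[day])
```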

I’d love to understand the reasoning better and what the recommended workaround would be here. I’m guessing that requiring a handler function to be wrapped before execution is meant to prevent a foot-gun where whether you get a cached on-demand response or not becomes less predictable or deterministic. But I’d be curious to hear whether that is indeed the rationale, and whether the design is being explored further or is likely to remain as is.


Hi there!

I think you understood the rationale pretty well.

The cache mechanism is largely built around a cache key derived from the request URL. If your function started to indicate different cache preferences for the same path, it would lead to inconsistent caches and unpredictable results.

I’d encourage you to use redirect rules (and edge handlers in the future) to point your requests at either a cacheable or an uncacheable function.

Since on-demand builders are still in beta, we are iterating heavily on their internal architecture and implementation. The library and public docs we provide for this feature are what we will be careful not to break, so that code you deploy today will still work in five years, which is an important property of websites that we try to uphold.
Once you start relying on implementation details of our libraries, we can no longer ensure that, and our support team will have to deal with the fallout of people using undocumented details, which we want to avoid.

This is largely why we refrain from giving you very fine-grained access to cache management. In the coming months we will likely be making improvements to on-demand builders that would be much harder to make if we allowed a lot of flexibility in cache management.


Hi Marcus,

Thank you for the response.

The cache preference is deterministic: given the same code and the same URL, it will always be the same. The challenge is supporting an API that lets users of my JAMstack framework make that choice for fine-grained routes. At a very high level, here is something a user of my framework (elm-pages) might define using the API:

  • Here’s a route, users/all.json. I want this route to have serverless (uncached) responses
  • Here’s a route, users/[int].json. I want this route to have on-demand cached responses

That’s just an arbitrary example, but the main point is that users have fine-grained control of which cache strategy (pre-render, on-demand cached response, or serverless) to use. It’s deterministic, a given route will always use the same strategy. But it is defined in code through the framework. Here’s some simplified pseudocode to try to make things a little more concrete:


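A minimal JavaScript sketch of the route-to-strategy mapping described above (all names are invented for illustration; the real framework API is elm-pages, not JavaScript):

```javascript
// Hypothetical sketch of the framework-level API: each route pattern is
// mapped, in code, to a rendering/caching strategy.
const routes = [
  { pattern: /^\/users\/all\.json$/, strategy: "serverless" },
  { pattern: /^\/users\/\d+\.json$/, strategy: "on-demand" },
];

// Deterministic: the same path always resolves to the same strategy.
function strategyFor(path) {
  const match = routes.find((route) => route.pattern.test(path));
  return match ? match.strategy : "pre-render";
}
```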
So as a framework author, I need a way to programmatically decide on a cache strategy. If there were a way to define regex-based redirect rules, I could generate a _redirects file at build time that routes to the appropriate renderer function (one wrapped in an on-demand builder, and one that is a vanilla serverless function). But because the API I expose in the framework provides more fine-grained routing than the _redirects syntax allows, I can’t do that mapping at the redirect-rule level.

If I’m not mistaken, frameworks like NextJS would face a similar challenge in letting users say whether a route should use a DPR-cached on-demand handler or not. The current on-demand builder API seems geared towards manually designating specific routes as on-demand builders, but frameworks have limited ability to leverage it, since it doesn’t provide a way to programmatically choose between cached on-demand responses and vanilla serverless responses, while the logic for this in a framework is often defined programmatically.

Anyway, just wanted to share my use case, as I know this functionality is at an early stage and I think other frameworks will run into this as well. Hope that’s helpful input. Happy to share more details if it’s helpful.

Thanks for your input. That is great!

We use a build plugin to generate a _redirects file for NextJS sites: GitHub - netlify/netlify-plugin-nextjs: A build plugin to integrate Next.js seamlessly with Netlify
I believe that since NextJS doesn’t support regex for route definitions, we’re able to translate between the routing mechanisms.

We don’t have plans to support regex, but edge handlers will likely give you this capability.


You’re right, I didn’t realize that. Forgive my limited NextJS knowledge! I’m in a different ecosystem and just look to these frameworks for inspiration.

I played around with NextJS page routes and API routes a bit, and indeed they don’t allow for more fine-grained routing. I had assumed the NextJS router would support routes like pages/post/[year]-[month]-[day].js, but it turns out a path segment can be either dynamic or static, not a mix of the two.

This is also the case for API routes, which I found surprising, because it means you cannot have API routes like pages/api/users/[id].json.js or pages/api/users/feed.xml.js. This can be limiting because such endpoints end up relying on the Content-Type header alone, whereas in some cases the file extension might be needed or preferred as well.

SvelteKit’s file-based router supports this kind of route. Here are some example SvelteKit API routes (AKA endpoints):
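These are illustrative file names following SvelteKit’s convention, where the file extension is part of the route:

```
src/routes/users/[id].json.js   →  serves /users/<id>.json
src/routes/feed.xml.js          →  serves /feed.xml
```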

If a SvelteKit user wanted to use an on-demand builder, they would run into the same challenge. In particular, the main challenge I see is that API routes cannot have file extensions.

As a framework author, I could limit the patterns users can express in file-based routes to ones that work with the supported Netlify _redirects syntax. For pages, I already have that limitation and it works well. The limitation I’m running up against is having a way to support file extensions.

Anyway, I hope that context is helpful. I would love to make on-demand builders a first-class citizen and support them in the next release of my framework. I had implemented a strategy like the netlify-plugin-nextjs approach you referred to, generating a _redirects file. That was working well until I ran up against the limitations I described, and I’m still not quite sure what the best path forward is at the moment.

Appreciate the insight into your experience with ODBs, @dillonkearns! Assuming no imminent changes on our end, I think Marcus’ advice here will be the way to go today.