Server Side Rendering with Netlify Functions

I’d love to know if Netlify Functions can be used for server side rendering.

This is what needs to happen:

Build

  • Netlify runs the build command to generate both the server lambda and the static files.
  • Netlify uploads the server file (for example /build/server.js) to AWS lambda.
  • Netlify uploads static files found in a folder (for example, /build/static/*).

Runtime

  • Netlify serves the static files when the /static/* URLs are accessed. Those files are cached by the CDN according to cache-control headers.
  • Netlify serves the HTML generated by the /build/server.js lambda for any other URL. Those HTML responses are cached by the CDN according to cache-control headers (a rough sketch of such a lambda is below).
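
For concreteness, here is a minimal sketch of what I imagine that server lambda could look like (just a sketch: it assumes React and react-dom/server, and the component, markup, and header values are illustrative):

const React = require("react");
const { renderToString } = require("react-dom/server");
const App = require("./app"); // illustrative root component

exports.handler = async (event) => {
  // Render the requested URL to an HTML string on the server.
  const html = renderToString(React.createElement(App, { url: event.path }));

  return {
    statusCode: 200,
    headers: {
      "Content-Type": "text/html",
      // Let the CDN cache the rendered HTML.
      "Cache-Control": "public, s-maxage=60"
    },
    body: `<!doctype html><div id="root">${html}</div>`
  };
};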

I took a look at the netlify.toml and I saw you can configure redirects.

  • Could those redirects be used to make Netlify serve the HTML generated by the server lambda?
  • If that’s the case, could those HTML responses be cached by the CDN?
  • Is there any other way I’m not aware of to make this happen?

Thanks for this question @luisherranz!

I’m actually in the process of writing a post which explains a model similar to this right this very moment! (With some differences in the nuance I suspect). I presented a little demo during a presentation at NEJS. (You might find the slides helpful)

The model I was exploring was in allowing user generated content to be submitted via a form, which would create content at a new URL and initiate a regeneration of the site to include that new page. While that page is being generated, I use Netlify’s custom 404 handling in the redirects to pass requests to that new URL to a serverless function, which gets the content directly from the content API and does a serverless render.

Once the site has been regenerated and the new URL exists, requests to it no longer 404 and are simply satisfied by the pre-generated static page.
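
To make that a bit more concrete, a function along these lines is roughly what handles the fallback render (a simplified sketch rather than the exact code from the repo; the content API URL and the markup are placeholders):

const fetch = require("node-fetch");

// Placeholder for whichever content API holds the submitted data.
const CONTENT_API = "https://example.com/api/lollies";

exports.handler = async (event) => {
  // The path that 404'd tells us which item to fetch and render.
  const id = event.path.split("/").pop();
  const res = await fetch(`${CONTENT_API}/${id}`);

  if (!res.ok) {
    return { statusCode: 404, body: "Not found" };
  }

  const data = await res.json();

  // Render the same markup the static build would have produced.
  return {
    statusCode: 200,
    headers: { "Content-Type": "text/html" },
    body: `<!doctype html><h1>${data.title}</h1><p>${data.description}</p>`
  };
};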

While I finish up my detailed post describing this all a bit more clearly, you might like to play with the demo as it is now, and you can also nose around in the code.

Demo: https://vlolly.net
Repo: https://github.com/philhawksworth/virtual-lolly (JAMstack demo site, prerendered with serverless API fallbacks)

I think of it as static first, with a serverless render fallback.
And I think this gets pretty close to the model you describe.

That’s interesting. Thanks for sharing @philhawksworth.

I have created a simpler repository and I think I have managed to make it work :slight_smile:

This is the live site: https://flamboyant-euclid-90b896.netlify.com/

This is the toml file:

[build]
  command = "npm run build"
  publish = "build/static"
  functions = "build"

# Serve the static assets directly from the publish folder.
[[redirects]]
  from = "/static/*"
  to = "/:splat"
  status = 200

# Send every other URL to the SSR lambda.
[[redirects]]
  from = "/*"
  to = "/.netlify/functions/server"
  status = 200

The server.js lambda is taking care of all the URLs with the /* redirect:
https://flamboyant-euclid-90b896.netlify.com/
https://flamboyant-euclid-90b896.netlify.com/some-blog-post
https://flamboyant-euclid-90b896.netlify.com/category/some-category

And the static files are properly served with the other /static/* redirection.
https://flamboyant-euclid-90b896.netlify.com/static/static.js

For some reason adding [[headers]] to /* doesn’t work for lambdas:

[[headers]]
  for = "/*"
  [headers.values]
    X-Custom-Toml = "Toml"

The _headers file is also not working for lambdas:

/*
  X-Custom-Root: Root

Both of these work fine for the static files. I don’t know if I’m doing something wrong or if that is a bug.

They only work if they are returned by the lambda:

exports.handler = (event, context, callback) => {
  // Simulate a slow server-side render with a 3 second delay.
  setTimeout(() => {
    callback(null, {
      statusCode: 200,
      body: `Dynamic page. Path: ${event.path}. Random: ${Math.random()}`,
      headers: {
        // Ask the CDN to cache the response for 15 seconds and then serve it
        // stale for up to 5 minutes while it revalidates in the background.
        "Cache-Control": "public, s-maxage=15, stale-while-revalidate=300"
      }
    });
  }, 3000);
};

This lambda simulates an SSR request with a setTimeout of 3 seconds.

The CDN works, and it is honoring the s-maxage directive. It serves the cached version for 15 seconds.

But the CDN is ignoring the stale-while-revalidate directive. After 15 seconds, it doesn’t serve the stale version from the cache; it generates a new response, making the user wait 3 seconds again.

Is that a bug? Should I open an issue somewhere?

You cannot set custom headers via netlify.toml or _headers for functions; your function needs to return the headers itself instead, as you’ve discovered. That is operating as expected.

I think you need to rely on max-age/public rather than stale-while-revalidate to achieve your goals; I don’t know what we intend to do with stale-while-revalidate for functions or if our proxy (which connects the function to a web request) handles it correctly, but since your experiment indicates no, I wouldn’t count on it (or on that changing anytime soon).

I’ll review with our operations team next week whether they feel like it is a bug or just intended not to work and follow up here if we come up with anything interesting.


The guys from Zeit call it serverless pre-rendering, although we’ve already been doing this for years for our clients with KeyCDN. Most CDNs support the stale-while-revalidate directive as far as I know.

The mechanism is simple:

  • The first time a URL is visited, the CDN requests the HTML from the serverless function, serves it, and saves it on disk/in memory for later use. This is the normal behaviour of any CDN, of course.
  • The second time a URL is visited, the CDN serves the cached file. Still normal.
  • The change is here: once the file is stale (controlled by the s-maxage directive), if the stale-while-revalidate directive is used, the CDN returns the stale HTML file to the user. After that, it requests a fresh one in the background from the serverless function and overwrites the stale one on its disk/in memory.
  • The next user who visits the site gets the fresh static file from the CDN.

The time-to-stale is controlled by the s-maxage directive and it can be 1 second (always check for a fresh file in the background) or any other time, for example 15 minutes. That obviously depends on the site’s needs. It can be further optimized by setting s-maxage to the maximum and using manual cache invalidation, usually with a soft-purge hook of the CDN API that marks all the files as stale.
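
To illustrate the mechanism, this is roughly what a cache layer honoring s-maxage and stale-while-revalidate does (a simplified in-memory sketch, not how any particular CDN is actually implemented; renderWithLambda stands in for the call to the serverless function):

const cache = new Map(); // url -> { body, sMaxAge, staleWhileRevalidate, storedAt }

async function handleRequest(url, renderWithLambda) {
  const entry = cache.get(url);
  const now = Date.now();

  if (entry) {
    const age = (now - entry.storedAt) / 1000;
    const fresh = age <= entry.sMaxAge;
    const withinSwr = age <= entry.sMaxAge + entry.staleWhileRevalidate;

    if (fresh) return entry.body; // normal CDN hit

    if (withinSwr) {
      // Serve the stale copy immediately and revalidate in the background.
      refresh(url, renderWithLambda);
      return entry.body;
    }
  }

  // Cache miss (or far too stale): the user has to wait for the lambda.
  return refresh(url, renderWithLambda);
}

async function refresh(url, renderWithLambda) {
  const { body, sMaxAge, staleWhileRevalidate } = await renderWithLambda(url);
  cache.set(url, { body, sMaxAge, staleWhileRevalidate, storedAt: Date.now() });
  return body;
}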

As you can see, the approach is similar to a statically generated site, but it has some benefits for medium/large publisher sites with thousands or tens of thousands of posts: the static HTML files are generated on demand and over time. The most-visited HTML files are generated first and served instantly, while the rest are generated over time, if ever. And the final user always gets static files from the CDN (except the very first time).

It’d be amazing to be able to use this approach with Netlify :slight_smile:


@philhawksworth very impressive!

Hi @fool :wave:!

Just wondering if Netlify has any plans to support the stale-while-revalidate header in a way that is similar to what Vercel is doing: Caching on Vercel's Edge Network | Vercel Docs

It would be awesome to be able to use the approach outlined by @luisherranz above :slight_smile:

Hey @michalczaplinski,

Could you elaborate a little bit more on what you’re trying to achieve here? Maybe there’s a methodology which just isn’t springing to mind. I think we need to hear your use case explained differently to perhaps conjure a solution which a) works for you and b) is standards compliant :smile:!

Hey @Scott :wave:

Sure! Our use case is the framework that we are building for creating WordPress-powered sites with React: https://frontity.org/

In terms of architecture, the framework is probably most similar to Next.js. The issue that prevents us from using and recommending Netlify is this:

So, the server-side rendering works, but the CDN is ignoring the stale-while-revalidate directive if it’s returned from the serverless function. Because server-side rendering can be relatively slow with React, we want to always serve the potentially stale page from the CDN while revalidating in the background.

Vercel describes this in more detail at https://vercel.com/docs/v2/serverless-functions/edge-caching#cache-control

So, we’re wondering if Netlify does/will offer something similar :slight_smile:

Hey @michalczaplinski,

Sorry for the delay in getting back to you!

At present, we don’t intend on supporting that setting. However, something along these lines is achievable and, I think, there are a couple of different ways to do so.

Firstly, have you checked out Google’s offline cookbook? You might find that service workers are capable of what you’re setting out to achieve. Something like a cache-then-network response, perhaps? Our friends at Smashing Magazine have touched on this before.
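
As a rough idea of that route, a stale-while-revalidate style fetch handler in a service worker looks something like this (just a sketch along the lines of the offline cookbook, not production code; the cache name is arbitrary):

// sw.js
self.addEventListener("fetch", (event) => {
  event.respondWith(
    caches.open("pages").then((cache) =>
      cache.match(event.request).then((cached) => {
        // Always kick off a network request and refresh the cache in the background.
        const network = fetch(event.request).then((response) => {
          cache.put(event.request, response.clone());
          return response;
        });
        // Respond with the cached copy if there is one, otherwise wait for the network.
        return cached || network;
      })
    )
  );
});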

If you’re not keen on that approach, could you make use of other cache-control headers? Set max-stale to 3 seconds, for example?

Hey, @Scott, thanks for the answer.

Another question: is there a way to invalidate/purge the cache (CDN) via an API request? I mean, other than generating a new build, of course. Just the cache :slight_smile:


Hey @luisherranz,

As far as I’m aware, there isn’t – it’s not documented on our open API if there is! :stuck_out_tongue:

Ok, thanks @Scott.

Then I guess the only workaround is to set the s-maxage cache-control directive of the HTML files to its maximum (31536000) to make them effectively immutable, and use a WordPress plugin that triggers a new build (Search Results for “netlify” | WordPress.org) and invalidates the CDN each time a post is published/updated.

It’s a shame because the new build is not necessary for Frontity; the cache invalidation would be enough. But at least people would be able to use Netlify if they want.
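
For reference, triggering that rebuild from WordPress would just be an empty POST to a Netlify build hook, something along these lines (the hook URL is a placeholder; the real one comes from the site’s build hooks settings, and the call would be made from the plugin whenever a post is published or updated):

const fetch = require("node-fetch");

// Placeholder build hook URL taken from the Netlify UI.
const BUILD_HOOK = "https://api.netlify.com/build_hooks/<your-hook-id>";

// Trigger a new build (and therefore a fresh CDN cache) on publish/update.
async function triggerRebuild() {
  await fetch(BUILD_HOOK, { method: "POST" });
}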

@luisherranz,

Probably not the best thing to be doing! Our CDN handles expiration normally, you’ll only need to consider revising this if you’ve set anything in your proxy response headers.

Yeah, I guess people could set s-maxage to something like 30 minutes for example, but that would mean the content wouldn’t be instantly updated when a post is published/updated, which is not ideal.

I think it’d be better to set s-maxage to the maximum to simulate an immutable file, and then trigger the CDN invalidation manually from WordPress.

Is there any other way to invalidate the CDN than triggering a new build?

I think perhaps you misunderstand how we invalidate cached assets. Have you read this: Better Living Through Caching? Basically, our CDN relies on filenames remaining the same and the cache-control header that we very intentionally set.

To answer your question: no, there isn’t any ‘trigger’ from your end to invalidate CDN node cache except through a build/deploy, as this is part of the atomic nature of deploys.

Hope that helps.

Nice. Thanks for the link :slight_smile:

And does that approach work for HTML responses returned by Netlify Functions or just for static HTML files?

@philhawksworth This is brilliant! Thanks for sharing it. I just have one question: how would you go about handling it if the data on the backend changes? Say the shape or color of the lolly?

Hey @Dennis, any plans to provide this trigger?
IMO it would be really nice to be able to purge the cache on each deploy (as Firebase Hosting does) or through an API; that way we could set a large s-maxage and avoid calling the SSR lambda function when it’s not needed.

Hey @Scott, I think none of those cases fits the intention of stale-while-revalidate :thinking:
With stale-while-revalidate we can serve a fast response from the server cache while a new request is also sent in the background; the SSR lambda is called again and the fresh response will be served (from cache again) next time.

IMO the best scenario for ISG is calling the lambda function (which is slower) only when really needed (in my case that means only when a new deploy is performed), and when the function is called, caching the response and serving it even when it’s not fresh, while a new one is being cached.

To answer your first question, the cache is already purged on each deploy for assets hosted on Netlify. If you are proxying to another URL, we will use whatever caching value the response gives us, so those assets may not be purged on deploy immediately, depending on the cache-control headers received.

As far as stale-while-revalidate, we do have a feature request for it and I’ve added this thread to the issue for additional context. There is definitely an intent to implement it. However, I don’t have information on if and when it will be released.
