I haven’t been able to find much information on this, so hopefully someone at Netlify or a community member who has worked on something similar could help me.
I’m trying to generate a PDF with a background function and prompt the client to download the file. I have read the docs on how to create background functions (which actually do show how to generate a PDF), but no response is sent back, so maybe I’m thinking about this incorrectly. The background functions docs say:
When a background function is successfully executed, you generally pass the result to a destination other than the originating client.
The emphasis on generally is my own. This seems to imply I can do what I’m hoping.
Why have a background function do what I’m asking?
- Generating a PDF while the client waits for a response takes a long time (poor UX)
- I have on occasion run out of memory during this operation, and as far as I understand, background functions provide more resources
What is currently happening?
- My background function endpoint responds immediately with a 2** status
- My function logs show that my function does run, but with no prompt at the client level to download a file
@zackseuberling What is happening aligns with what you’d expect from reading the documentation: background functions aren’t intended to be used in the way you’re attempting.
Netlify’s Background Functions provide an option for serverless functions that run for up to 15 minutes and don’t need to complete before a visitor can take next steps on your site.
When a function is invoked asynchronously, there is an initial 202 success response that indicates that the function was successfully invoked. The function will run separately in the background until it completes or it reaches the 15 minute execution limit.
The client browser won’t be waiting on the response, and completion of the background function wouldn’t inherently trigger anything for the client browser.
The endpoint that starts the function is appropriately told the equivalent of “ok, we’ll try and do that”… and then it executes, taking as long as it takes, and if it fails it retries.
It’s why in the example code it emails the user with a link to the report when it is ready.
You could engineer a solution that works another way than “emailing the report”, for example polling or listening for a realtime value to see whether the document has been created yet. But the initial connection is not automatically kept open, and the resulting file can’t be piped back to the requester.
If you do want it to “prompt the user to get the file” when it’s finally ready, then you ultimately need two things:
- Knowledge that the file is ready
- The location of the file
So if you stored those details upon completion in a location that could then be checked, you would be able to display the prompt or trigger a download with your front-end logic.
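A sketch of what that stored record could look like. The function name and field names here are my own invention; the store itself could be a database row, a JSON blob next to the file, or anything else your front end can read:

```javascript
// Hypothetical shape for the "is it ready yet?" record. The background
// function writes this when it finishes; the front end checks it.
function buildStatusRecord(jobId, fileUrl = null) {
  return {
    jobId,
    status: fileUrl ? "ready" : "pending", // "ready" only once a URL exists
    fileUrl,
    updatedAt: new Date().toISOString(),
  };
}
```

With a record like this, the front end only needs to look for `status === "ready"` and then use `fileUrl` to trigger the download.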
AH, okay, in theory this would work:
- Client triggers PDF generation
- Function generates a PDF and uploads it to some public cloud storage bucket
- Client-side JS polls the expected PDF location every 2 seconds (for example)
- Once the fetch succeeds, the client prompts a download
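The polling step above could be sketched like this, assuming the PDF lands at a predictable URL. `fetchFn` is injected so the loop is testable; in the browser you would pass `window.fetch`:

```javascript
// Sketch of the client-side polling loop. A 404 means "not ready yet";
// any other non-OK status is treated as a real error.
async function pollForPdf(pdfUrl, fetchFn, intervalMs = 2000, maxAttempts = 30) {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const res = await fetchFn(pdfUrl);
    if (res.ok) return pdfUrl; // found it; hand the URL to a download prompt
    if (res.status !== 404) throw new Error(`unexpected status ${res.status}`);
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  return null; // gave up; surface a "try again later" message instead
}
```

The `maxAttempts` cap matters: without it, a job that failed server-side would leave the client polling forever.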
Welp, I don’t love it. Especially the part where I’m polling a URL and (weirdly…) handling a 404 as a flag to re-fetch. I’ll have to talk this one through with my client to see what we can do.
Should the docs be updated to clarify that a successful call to a background function will never return anything other than a 202 back to the client? Or is that also not true?
As I’d mentioned in passing, you could avoid manual polling by watching a realtime value, for example with Firebase or Firestore. That would also mean you don’t need to poll for the existence of the resulting file.
Upon the initial call you could insert a record that represents it being in the queue, and then update that record with the file location when it’s actually ready.
It would allow you to know there is a file being processed, and only make the request for it once it does exist.
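The realtime pattern can be sketched with a tiny in-memory stand-in. This is not Firestore code; with the real SDK you would subscribe via `onSnapshot` on the job document instead, but the shape of the flow is the same:

```javascript
// Toy stand-in for a realtime store: subscribers are notified on every
// update, so the client reacts the moment fileUrl appears, with no polling.
class JobRecord {
  constructor() {
    this.listeners = [];
    this.data = { status: "queued", fileUrl: null };
  }
  subscribe(listener) {
    this.listeners.push(listener);
    listener(this.data); // deliver current state immediately, like onSnapshot
  }
  update(patch) {
    this.data = { ...this.data, ...patch };
    this.listeners.forEach((listener) => listener(this.data));
  }
}
```

The background function’s last step would call `update({ status: "ready", fileUrl })`; the subscribed client triggers its download prompt as soon as it sees a non-null `fileUrl`.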
UX-wise, you could make it all work much the same no matter how it’s done.
Well that sounds pretty neat! I don’t know if I’ll have the expertise to pitch that as a solution, but good to know something like that would be possible.