Stream content to function response body

Hello,

I was wondering if it’s possible to stream files to the response body.
I don"t see any ways to do this with a netlify function:

const http = require('http');
const fs = require('fs');

// With a plain Node server, I'd just pipe the file into the response:
const server = http.createServer((req, res) => {
  const src = fs.createReadStream('./big.file');
  src.pipe(res);
});

Thanks!

Hi,

Thanks for writing in. Right now you can’t stream data to a function. You have to send the entire payload with the request.
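
The same is true in the other direction: a function has to build its entire response body before returning it, roughly like this (a sketch, assuming the Lambda-style handler signature and your ./big.file from above):

const fs = require("fs").promises

exports.handler = async () => {
    // The whole file has to be read into memory up front; there is
    // no way to pipe it into the response incrementally.
    const buf = await fs.readFile("./big.file")
    return {
        statusCode: 200,
        isBase64Encoded: true,
        body: buf.toString("base64"),
    }
}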

Hello,

Do you have any plans to support that, or is there a workaround?

hi @armaldio - can you outline your use case a little more: how would this be advantageous, why do you need it, etc.? The more information we have, the easier it is to make a case for this 🙂

Sure,

I’m trying to stream a big file so I can hook into progress events as the file is sent from a Lambda to the browser; I’d like to show a progress bar.
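
For context, on the browser side I’d track progress roughly like this (a sketch; it assumes the function sends a Content-Length header, and the endpoint name and updateProgressBar helper are hypothetical):

const response = await fetch("/.netlify/functions/download")
const total = Number(response.headers.get("Content-Length"))
const reader = response.body.getReader()
let received = 0

for (;;) {
    const { done, value } = await reader.read()
    if (done) break
    received += value.length
    updateProgressBar(received / total) // hypothetical UI helper
}

The missing piece is the function side: without streaming, nothing reaches the browser until the lambda has buffered the whole file.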

Right now, we don’t support streaming from a lambda function and we don’t have enough information to file a feature request. That said, how did you plan to implement what you describe?

I have a use case where I’m implementing a limited reverse proxy for a particular service, to work around the fact that one site enforces CORS and the service doesn’t support HTTPS. The images are fairly small and I expect low enough traffic that I can get away with buffering each response in memory and joining the chunks with Buffer.concat when it finishes, but this is definitely suboptimal, and it’d be easier to just return the stream. Streaming would also scale better and use less memory.

My code currently looks something like this:

"use strict"

const http = require("http")

// Note: `url` (the upstream address being proxied) is defined elsewhere.
exports.handler = async (event, context) => {
    const res = await once(http.get(url), "response")
    const chunks = []
    let size = 0

    function receive(buf) {
        size += buf.length
        chunks.push(buf)
    }

    res.on("data", receive)

    try {
        await once(res, "end")
    } finally {
        res.removeListener("data", receive)
    }
    
    return {
        // headers, status code, etc.
        isBase64Encoded: true,
        body: Buffer.concat(chunks, size).toString("base64"),
    }
}

// Resolves with the first emission of `event`, rejecting on "error".
function once(emitter, event) {
    return new Promise((resolve, reject) => {
        function pass(arg) {
            emitter.removeListener("error", fail)
            resolve(arg)
        }
        function fail(arg) {
            emitter.removeListener(event, pass)
            reject(arg)
        }
        emitter.once(event, pass)
        emitter.once("error", fail)
    })
}
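
(As an aside, newer Node versions ship a built-in version of this helper, events.once, so the hand-rolled one above could be dropped. Inside the handler, the first line would become:)

const { once } = require("events")

// events.once resolves with an array of the emitted arguments,
// hence the destructuring:
const [res] = await once(http.get(url), "response")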

Ideally, I’d use a rewrite rule and instruct Netlify to explicitly not cache the returned result, since the backend server in question returns a random image that changes on each request, with headers explicitly saying not to cache the response. But since that functionality doesn’t exist, it’s much easier to just write a function than to file a very specific feature request. And unlike OP, I do at least have the option of keeping the full response in memory for a short period of time.

If I could return a stream as the body, knowing it’d get implicitly .pipe()d to the response, I’d do this instead, removing about 50% of the code in my function’s handler:

exports.handler = async (event, context) => {
    const res = await once(http.get(url), "response")

    return {
        // headers, status code, etc.
        body: res,
    }
}

Thanks for chiming in here @isiahmeadows! We have an open request to support streaming from functions in our proxy. We’ll let you know once it’s been added. Thanks.

Any update on the streaming response body PR?

Heya @arlac77 - sorry to be slow, but I have put the question to our developers to understand if there are plans to change this yet! Hope to get back to you on it soon.

Hi, just to add some details: as you probably already know, we use AWS Lambda functions under the hood. It looks like AWS Lambda does not currently support streaming responses for Node- or Go-based functions, only for Java, which is a language we do not support.

That said, we do have a feature request filed but there are some technical challenges that we will need to resolve before we’ll be able to implement this. We will update here when and if we do.

Thank you for your patience.

Hello Dennis, all,

I was wondering if there was an update on this at all? I looked into streaming support in AWS and found some articles on it.

My apologies if this has been addressed elsewhere; my searches ended here (and also at this related topic, which links back here: Netlify Functions returns readable stream to browser).

Hey @keneucker,

Thanks for that. Reading those articles, it looks like it might be a big change for our systems to add that support, if it’s possible at all. So it might not happen any time soon.

Bummer. Okay, thanks for the reply.

Hey, it’s 2023 - are streaming responses supported yet, or will they be, for any of the languages Netlify functions use?

My use case: streaming text-completion responses from OpenAI to the client, so the user gets immediate feedback instead of waiting a few seconds for the entire response to be sent.

Hi, unfortunately this is not supported quite yet.

any updates on when this will be supported?

It’s already supported: Create functions | Netlify Docs
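
A minimal sketch of what that can look like with the current Functions syntax (the file name and upstream URL here are placeholders):

// netlify/functions/download.mjs
export default async (req) => {
    const upstream = await fetch("https://example.com/big.file")
    // upstream.body is a web ReadableStream; returning it inside a
    // Response streams it through to the client chunk by chunk.
    return new Response(upstream.body, {
        headers: { "content-type": "application/octet-stream" },
    })
}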