Redirects don't work for LLM Streaming

I’m using the _redirects file for proxying API requests that return `content-type: text/event-stream; charset=utf-8` responses. Site Deploy is at
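For context, the proxy rule in question looks roughly like this (a minimal sketch; the upstream URL is a placeholder, not my actual endpoint):

```
# _redirects: proxy /api/* to the upstream streaming endpoint.
# Status 200 makes this a rewrite (proxy), not a browser redirect.
/api/*  https://api.example.com/:splat  200
```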

I’m noticing two things:

  1. The Netlify response is not streamed. It arrives in a single chunk.
  2. If the response gets too big (takes too long), Netlify returns a 502 Bad Gateway.

Given the nature of proxying to services like OpenAI API, longer streamed responses are becoming more common.

I already moved away from Netlify Functions because of the 10-second limit. Can you confirm what the limit is on proxying for config-based redirects? Given your technical architecture and cost structure, is this hard or cost-prohibitive to support?

My ask is either:

  1. Remove or change the limit so that proxying longer-running LLM streams doesn’t fail.
  2. Document that proxy redirects (and edge functions) have limited capability for LLM response streaming.

FYI, this was at the same time as

We recommend Edge Functions for streams: Long-running Edge Functions | Edge Functions on Netlify, as they can theoretically run for an infinite time.
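A minimal sketch of what that looks like as an Edge Function, assuming a placeholder upstream URL and path (not the poster's actual service): it fetches the upstream SSE endpoint and forwards the body as a stream instead of buffering it.

```typescript
// Sketch of a Netlify Edge Function that proxies an upstream SSE endpoint
// and passes the stream through unbuffered. The upstream URL and the
// /api/stream path are placeholder assumptions for illustration.
const handler = async (request: Request): Promise<Response> => {
  const upstream = await fetch("https://api.example.com/v1/stream", {
    method: "POST",
    headers: { "content-type": "application/json" },
    body: await request.text(),
  });

  // Forward the upstream body as-is: a ReadableStream is relayed chunk by
  // chunk to the client rather than being collected into a single response.
  return new Response(upstream.body, {
    status: upstream.status,
    headers: { "content-type": "text/event-stream; charset=utf-8" },
  });
};

export default handler;

// Netlify reads this export to decide which requests the function handles.
export const config = { path: "/api/stream" };
```

Because the `Response` wraps the upstream `ReadableStream` directly, data flows to the client as soon as the upstream sends it, which is the behavior the redirect-based proxy isn't providing here.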

Thanks for your reply @hrishikesh. If I understand you correctly, using _redirects to proxy a request to an endpoint that returns event-stream is not supported? (Or should be done via edge function?)

Not sure how your API is sending the response, but we do cut connections that are idle (not sending any data). Edge Functions, on the other hand, can handle this.