
Any downsides to multiple proxies?

General questions about proxying multiple repos to different subdirectories on a root domain.

  • Are there any performance issues as compared to multiple subdomains?
  • What about pricing differences?
  • And finally, is it the done thing?

My reading thus far:

Quick background: I was an early adopter of Netlify (late 2015) and used it for various sites, but nothing for a while. We use Firebase Hosting for our current SaaS since we're using various other Firebase features. The proxy feature would be the "killer feature" for us, as we're starting to bump into a few issues with having a single deployment for the entire app. Splitting our product would be very beneficial.

Hi @oodavid,

Happy to answer the questions, but I must say I haven't fully understood them. Could you please elaborate a bit?

Yes, proxying is going to be slower than using a domain directly. The reason is the additional round trips: the browser sends a request to the Netlify CDN → the Netlify CDN matches it against a proxy rule → Netlify sends a request to that destination → receives the content from there → sends it back to the browser.

When you use a domain directly, the browser requests the content from the CDN → the CDN serves it. The additional compute and transfer time will lead to degraded performance in a head-to-head comparison.

Differences as compared to what?

Done as in, do you mean usable? If yes, then yes, proxying can be used for various use cases, though it has its own set of limitations. If you have specific expectations about the feature, jot those down and we can advise whether they're possible.

Is this true even for Netlify proxying Netlify?

Also, re: "slower", are we talking a noticeable rendering hit or something negligible? A slight hit doesn't worry me 🙂

I mean compared to using subdomains; I’ve been stung by AWS network charges for moving data between regions and services in the past.

I’m trying to figure out the pros and cons of moving from a monolith CI/CD approach to microsites.

My initial search was for a service that allows subdirectory deployment without affecting other areas of the site. My current frustration is that we have a number of discrete parts of our product that I'd rather have engineers working on in isolation. It's not great to have deployments break over something like a merge not being clean. In fact, most of the arguments in Building Large Sites on Netlify apply to us.
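To make it concrete, here's a sketch of the kind of setup I have in mind (the site names are made up for illustration): a root site whose `_redirects` file proxies each discrete part of the product to its own Netlify deployment.

```
# _redirects on the root site (e.g. www.example.com)
# Status 200 = rewrite/proxy: the CDN fetches from the target site,
# but the URL in the browser stays on the root domain.
# The app/docs/blog site names below are hypothetical.
/app/*   https://app-site.netlify.app/:splat   200
/docs/*  https://docs-site.netlify.app/:splat  200
/blog/*  https://blog-site.netlify.app/:splat  200
# Anything not matched above falls through to the root site's own files.
```

Each team could then deploy its own site independently without touching the others.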

Cheers for your time!

EDIT:

Now all requests to /api/... will be proxied through to https://api.example.com straight from our CDN servers without an additional connection from the browser. If the API supports standard HTTP caching mechanisms like ETags or Last-Modified headers, the responses will even get cached by our CDN nodes.
~ Rewrites and proxies
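For reference, the rule the docs are describing would look like this in a `_redirects` file (the `api.example.com` host is the docs' example, not a real endpoint):

```
# Requests to /api/* are fetched from api.example.com by the CDN itself,
# so the browser never makes a second connection.
/api/*  https://api.example.com/:splat  200
```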

Does this mean the caching would be in effect when proxying a Netlify domain?

Yes, it’s true for that case too.

The rewrites to static assets are stored in the CDN cache, so only the initial load should be slower. Subsequent loads are served from the cache at normal CDN speeds.

When proxying, you'd end up paying for bandwidth twice; with a direct subdomain, that's not the case. It won't be exactly double, but you can expect it to be roughly that: every request that's not yet cached in the CDN is billed for double bandwidth (once for the CDN fetching from the origin site, and once for serving the response to the browser).


With the use case you've described, it's possible and achievable. Nothing super problematic stands out at the moment.

Yes, as I said above in this response, proxied content (static) is cached in the CDN.

Brilliant!

I think for our use case (weekly deployments, SPA) the edge caching will be a huge boon. We'll discuss moving to Netlify tomorrow and do some tests. I reckon it'll be pretty quick and painless.

Thanks again Hrishikesh