Document Exceeds Maximum BSON Object Size

Hello everyone,

I’m encountering an error during a push operation, and I could use some help figuring it out. Here’s the error message I’m seeing:

⠹ CDN diffing files... ›   Warning: JSONHTTPError:  500
 ›   Warning: 
{
  "name": "JSONHTTPError",
  "status": 500,
  "json": {
    "error": "The document exceeds maximum allowed BSON object size after serialization (on db-origin-aws-cmh-prod-12.node.cmh-prod.nf.internal:27017)"
  }
}

  • Site Name: https://mc-map-endlessearth.netlify.app/
  • CLI Version: netlify-cli/17.34.2 linux-x64 node-v22.6.0

You likely have too many rules in your _redirects or _headers file; that is what usually causes this error.

Hmm, but I didn't apply any header or redirect rules, and I still can't upload.

I see the 500 you mention in the logs, but I don’t see the exact file that’s causing this. I’ve asked the devs to check and will let you know.


Just checked with the devs, and they're wondering if you're uploading too many files or files that are nested deep into directories. Does deploying via Git work?

Actually, I generate the site in real time (locally), so it isn't committed to Git.
Maybe there's a limit on the number of files? Or maybe the CLI tool can't compute the diff across that many files.

"if you're uploading too many files or files that are nested deep into directories"

Yup, I think it's a little too deep.

Oh, I think I know what the issue is.


When I looked through the docs, I noticed there is no limit on the total number of files, but there is a per-directory file count limit.

As long as you can generate whatever you need using some command on an Ubuntu machine, you should be able to deploy via Git.

Do you have more than 54k files in any folder though?

Yup, it's up to 100k.
But I'm wondering how it's calculated: for maps/world/, does it count all subdirectories and their files, or only the files directly in that directory?

It's too big; it would hit the 5 GB Git limit (on GitHub). But since I don't push to Git, I'm not sure whether the repo limit applies.

GitLab provides up to 10 GB, I believe. In any case, if your directory has over 54k files then even deploying via Git won’t work. I don’t know if files from sub-folders are also taken into account in that limit, so let me confirm that.


I've run into similar issues before. My site has 500k unique pages, although each is in its own "folder",
for example:
/pages/page1/index.html
/pages/page2/index.html
/pages/page3/index.html

Deploying ~200k pages worked fine; however, when I scale it up, it freezes. Not to mention the total size of all pages comes to 28 GB. Is there any way to compress and upload?

I can confirm that the limit is 54k / directory. Sub-directories work fine. Here’s the code I used to test this: github.com/hrishikesh-k/f-124620. I created 150k files (30k in each of the 5 directories). All of this was done in nested directories (but can also be done in flat style by changing the --recursive flag in package.json to --flat).
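(For illustration, here is a rough Python equivalent of that test setup. The actual repo uses a Node script driven by the --recursive/--flat flags mentioned above, so treat DIRS, FILES_PER_DIR, and the output paths as placeholder names in this sketch.)

# Sketch: create DIRS directories with FILES_PER_DIR small files each,
# either nested inside one another ("recursive") or side by side ("flat").
import os

DIRS = 5
FILES_PER_DIR = 30_000
RECURSIVE = True  # set to False for the flat layout

parent = "out"
for d in range(DIRS):
    # Nested: out/dir-0/dir-1/... ; flat: out/dir-0, out/dir-1, ...
    parent = os.path.join(parent, f"dir-{d}") if RECURSIVE else os.path.join("out", f"dir-{d}")
    os.makedirs(parent, exist_ok=True)
    for i in range(FILES_PER_DIR):
        with open(os.path.join(parent, f"file-{i}.txt"), "w") as fh:
            fh.write("test\n")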

The deploy works fine (although it takes a really long time to upload so many files).

So, it is possible to upload many files as long as a single directory file count doesn’t go over 54k. Can you all try using the above repo to see if you’re able to deploy? Maybe you can change the numbers in there to get a better representation of the number of files in your deploy.

I know the deploy’s size is not an issue for us, so 28 GB shouldn’t matter either.
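For anyone who wants to sanity-check their own build output against this limit before deploying, here is a minimal sketch (assuming Python 3; "dist" is a placeholder for your publish directory, and 54k is the per-directory figure confirmed above):

# Report directories whose immediate file count exceeds the 54k limit.
# Files in sub-directories count against their own directory, not the parent.
import os

PUBLISH_DIR = "dist"  # placeholder: your site's publish directory
LIMIT = 54_000

for dirpath, dirnames, filenames in os.walk(PUBLISH_DIR):
    count = len(filenames)  # only files directly inside this directory
    if count > LIMIT:
        print(f"{dirpath}: {count} files (over the {LIMIT:,} limit)")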

Okay, I will slightly alter your approach and implement it in Python to generate files of a specific size that mimic our structure, and then try to upload with that. Also, 150k seems fine; we've gone up to ~167k folders at most, I believe, but once we try 200k it breaks. So I will try out all these scenarios and get back to you with the results!

Hi @hrishikesh
So I have created a script that matches our needs and simulates a similar load and naming structure. While your script worked for me with a few hundred thousand entries, here is another script I've written in Python (Python directory creator · GitHub) that creates a directory with 900,000 subdirectories, each holding an index.html file of about 4 KB. Please try to upload with this, because when I try, it throws the same error at me during the upload stage.
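For reference, a minimal sketch of a generator along the lines described above (my own reconstruction from the stated parameters, not the exact gist; ROOT is a placeholder output directory):

# Sketch: one parent directory with 900,000 subdirectories,
# each holding an index.html of roughly 4 KB.
import os

ROOT = "site"          # placeholder output directory
N_DIRS = 900_000       # subdirectory count from the post above
PAGE_BYTES = 4 * 1024  # approximate size of each index.html

# Build one ~4 KB HTML body once and reuse it for every page.
padding = "x" * (PAGE_BYTES - len("<html><body></body></html>"))
body = f"<html><body>{padding}</body></html>"

for i in range(N_DIRS):
    page_dir = os.path.join(ROOT, f"page-{i}")
    os.makedirs(page_dir, exist_ok=True)
    with open(os.path.join(page_dir, "index.html"), "w") as fh:
        fh.write(body)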

Are you actually trying to upload 900k files, though? Uploading that many is likely going to take hours, so I'd rather simulate an issue closer to the scale you actually need. If you're going over 200k files and that's failing, I don't see a good reason to test 900k files. It's a huge difference.

Even if 900k files fail, it's very possible that we'd be hitting some other system limitation we haven't discovered yet, quite frankly because no one has tried to deploy 900k files before, and definitely not using the CLI.

I tried deploying via Git: Deploy details | Deploys | f-124620 | Netlify, and that failed with a similar issue. So at this point, with a file count that high, this error is not even limited to the CLI.

In reality, yes, it is possible we hit over 900k nodes, though maybe not all in one folder. For however many pages are in one folder, we have an equal number of /page-data/ files, plus misc pages and assets for all 600k pages, so in that regard the total upload more than likely exceeds 900k. It may not all be in one directory, but it is spread out.