Best way to serve about 60 *.json files each about 1-3MB size?

I built a website with Jekyll (a static-site CMS). The site is stored as a private Git repository on Bitbucket, and I publish it using Netlify. Everything works. Part of this website is a set of about 60 *.json files, each about 1-3 MB in size. These files are stored in Git and served to end users through Netlify, like all other components of the website.
These files change 3 times per day, so there are 3 commits every day.

These *.json files cause a problem: the Git repository grows over time! Attempts to delete old versions of the *.json files (with the BFG tool plus “git gc/prune”) cannot be fully automated - after running BFG on my machine, I have to ask the Bitbucket admins to run “git gc” on their servers.
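For reference, the manual cleanup looks roughly like this (a sketch only; the repo URL and the bfg.jar location are placeholders, and the server-side “git gc” still has to be done by the Bitbucket admins afterwards):

```shell
# Sketch of the local BFG cleanup described above.
# Assumes bfg.jar has been downloaded; the repo URL is a placeholder.
purge_json_history() {
    repo_url="$1"   # e.g. git@bitbucket.org:me/my-site.git (placeholder)

    # BFG works on a bare mirror clone, not on the working copy
    git clone --mirror "$repo_url" repo.git

    # Rewrite history, dropping old versions of the *.json files
    # (BFG protects the files in the latest commit by default)
    java -jar bfg.jar --delete-files '*.json' repo.git

    cd repo.git || return 1

    # Expire the now-unreferenced objects locally, then push the rewritten history
    git reflog expire --expire=now --all
    git gc --prune=now --aggressive
    git push
}
```

Even after this push, Bitbucket keeps its own copies of the old objects until a garbage collection runs on their side - which is exactly the step I cannot automate.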

And, actually, I do not need version control for these *.json files at all.
They just need to be stored somewhere and be accessible to end users.

Can I store these *.json files somewhere using Netlify infrastructure?
Some… file server?
Some… CDN?
Is Git LFS a good idea?
(Specifically for Git LFS - can I completely delete/purge old versions of the files myself?)

Hi @stargazer33 :wave:t6:

Netlify supports using Git LFS for large files, so you can store your JSON files in Git LFS and configure Netlify to serve them directly from there. Large Media setup | Netlify Docs

There are many options for serving your large files outside of your Git repo, but the best option will depend on your specific requirements and constraints.

Can these JSON files be generated by Jekyll at build time instead of being committed to Git? Ruby’s to_json method together with a .json.erb extension can be pretty powerful.

Tom’s suggestion might be mine too.

I would NOT recommend using Large Media (sorry to contradict my colleague Sam): these files are not very large, and that feature has some quirky behavior, so we don’t suggest it unless you need to store tons of 500 MB PDFs in your repo.

It shouldn’t take us too long to clone your repo, even if it has a lot of JSON files in it, so I’d just keep them in the website that needs them.

Do keep in mind that we don’t want to host JUST your JSON files - it is against our terms of service to store files for computers rather than humans. So hopefully whatever is consuming that JSON is also a Netlify site? If not, the answer is “store them somewhere else, please”, since we intend to host websites for humans to read :slight_smile:


These *.json files are definitely part of my website! And the website is on Netlify. It is just that the *.json files have a different size/update frequency than the other parts of the website.

These *.json files can NOT be generated by Jekyll (I have a separate backend program that generates them 3 times per day).

Well, in the meantime I found out that I can upload these *.json files DIRECTLY to Netlify without storing them in Git (or in Git LFS). Something like:

netlify deploy --prod --site ... --dir ...

So my idea is:

  • create a subdomain:
  • upload the *.json files to the subdomain every time they change: netlify deploy --prod --site ... --dir ...
  • the main site (Jekyll) just loads these *.json files from the subdomain, with something like <link href="">

This way I can update the site (Jekyll) and the *.json files separately.
And I do not store copies of JSON files in Git.
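The upload step could be sketched like this (the site ID and directory names are hypothetical placeholders, not my real values):

```shell
#!/bin/sh
# Sketch: deploy the regenerated *.json files to the subdomain's own
# Netlify site, bypassing Git entirely.
deploy_json() {
    site_id="$1"   # Netlify site ID of the JSON subdomain (placeholder)
    json_dir="$2"  # directory where the backend writes the *.json files

    # Refuse to deploy an empty directory (e.g. the backend run failed)
    count=$(find "$json_dir" -name '*.json' | wc -l)
    if [ "$count" -eq 0 ]; then
        echo "no JSON files found in $json_dir" >&2
        return 1
    fi

    netlify deploy --prod --site "$site_id" --dir "$json_dir"
}
```

The backend could call something like deploy_json my-json-site ./out after each of the three daily regenerations; the main Jekyll site would stay untouched.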

Do you see drawbacks/disadvantages of this approach?

UPDATE: it seems the approach works! But…
How to specify Access-Control-Allow-Origin in _headers for multiple (sub)domains? I have a few (sub)domains:

And I want all these subdomains to use data from the JSON subdomain. Just using a wildcard like Access-Control-Allow-Origin: * is too broad; I would prefer to restrict access to my own (sub)domains…
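As far as I understand, the Access-Control-Allow-Origin header itself only carries a single origin (or the * wildcard), so a static _headers file can pin exactly one origin per rule. A minimal sketch, with a placeholder path and domain:

```
/data/*
  Access-Control-Allow-Origin: https://www.example.com
```

Allowing several origins seems to require something dynamic that echoes back a whitelisted Origin header, which is why I am asking.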

@stargazer33 it looks like you found the thread I was going to share :slight_smile: Access-Control-Allow-Origin to multiple domains - #2 by luke Could you give that a try and let us know if it helps?