Best way to serve about 60 *.json files, each about 1-3 MB in size?
I built a website (Jekyll CMS). This website is stored in a private Git repository on Bitbucket, and I publish it using Netlify. Everything works. Part of this website is about 60 *.json files, each about 1-3 MB in size. These files are stored in Git and served to end users through Netlify (like all other components of the website).
These files are changed 3 times per day, so there are 3 commits every day.
There are some problems because of these *.json files: the Git repository grows over time! Attempts to delete old versions of the *.json files (with the BFG tool and "git gc"/"git prune") cannot be fully automated - I have to ask the Bitbucket admins to run "git gc" on their servers (after I run BFG on my machine).
And, actually, I do not need version control for these *.json files at all.
They just need to be stored somewhere and be accessible to end users.
Can I store these *.json files somewhere using Netlify infrastructure?
Some… file server?
Is Git LFS a good idea?
(Especially for Git LFS - can I completely delete/purge old versions of the files by myself?)
I would NOT recommend using Large Media (sorry to contradict my colleague Sam), since these files are not very large and that feature has some quirky behavior - we don’t suggest it unless you need to store tons of 500 MB PDFs in your repo.
It shouldn’t take us too long to clone your repo, even if it has a lot of JSON files in it, so I’d just keep them in the website that needs them.
Do keep in mind that we don’t want to host JUST your JSON files - it is against our terms of service to store files for computers rather than humans. So hopefully whatever is consuming that JSON is also a Netlify site? If not, the answer is “store them somewhere else, please”, since we intend to host websites for humans to read.
These *.json files are definitely part of my website! And the website is on Netlify. The *.json files just have a different size and update frequency than the other parts of the website.
These *.json files can NOT be generated by Jekyll (I have a separate backend program that generates them 3 times per day).
Well, in the meantime… I found out I can upload these *.json files DIRECTLY to Netlify without storing them in Git (or in Git+LFS). Something like
netlify deploy --prod --site ... --dir ...
So my idea is:
create a subdomain: jsonfiles.example.com
upload *.json files to subdomain every time they change: netlify deploy --prod --site jsonfiles.example.com --dir ...
the main site example.com (Jekyll) just loads these *.json files from the subdomain at runtime. Something like fetch("https://jsonfiles.example.com/file1.json") (a <link> tag would not actually load JSON data)
This way I can update the site (Jekyll) and the *.json files separately.
And I do not store copies of JSON files in Git.
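The consuming side of this plan could be sketched like this - a minimal sketch, assuming the subdomain name from the steps above; jsonUrl and loadData are hypothetical helper names, not part of any Netlify API:

```javascript
// Sketch: how the main Jekyll site could load a data file from the
// separately deployed JSON subdomain at runtime.
const JSON_HOST = "https://jsonfiles.example.com";

function jsonUrl(name) {
  // Build the absolute URL for one of the ~60 generated files.
  return `${JSON_HOST}/${name}.json`;
}

async function loadData(name) {
  // Plain fetch; works in browsers and Node 18+. Note that the CORS
  // headers served by the JSON subdomain must permit the calling origin.
  const res = await fetch(jsonUrl(name));
  if (!res.ok) throw new Error(`Failed to load ${name}: ${res.status}`);
  return res.json();
}
```

Because the JSON lives on a different (sub)domain than the page requesting it, these fetches are cross-origin, which is exactly why the CORS question below matters.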
Do you see drawbacks/disadvantages of this approach?
UPDATE: it seems the approach with jsonfiles.example.com works! But…
How can I specify Access-Control-Allow-Origin in _headers for multiple (sub)domains? I have a few (sub)domains: staging.example.com, dev.example.com, example.com.
And I want all these subdomains to use data from jsonfiles.example.com. Just using a wildcard like Access-Control-Allow-Origin: * is too broad; I would prefer to restrict access to example.com and its subdomains…
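One caveat worth knowing: a static _headers file can only emit a fixed header value, so it cannot reflect "whichever of my subdomains made this request". To allow several specific origins, the Access-Control-Allow-Origin value has to be computed per request, e.g. in a Netlify Edge Function. A minimal sketch of that allowlist logic, assuming the subdomains from the question; allowOrigin is a hypothetical helper, not a Netlify API:

```javascript
// Origins that are allowed to read the JSON files cross-origin.
const ALLOWED = new Set([
  "https://example.com",
  "https://staging.example.com",
  "https://dev.example.com",
]);

function allowOrigin(requestOrigin) {
  // Return the value to put in Access-Control-Allow-Origin, or null to
  // send no CORS header at all for unknown origins. When reflecting the
  // origin like this, the response should also include "Vary: Origin" so
  // caches do not serve one origin's response to another.
  return ALLOWED.has(requestOrigin) ? requestOrigin : null;
}
```

The function would read the request's Origin header, call something like allowOrigin, and set the response headers accordingly; unknown origins simply get no CORS header and the browser blocks the read.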