We are migrating a site from Aerobatic (https://www.aerobatic.com/), which has been a great service for us but is unfortunately shutting down.
The site allows users to log in to view daily updated data. The data resides in several large JSON files (about 50 files of roughly 2.5 MB each), which are currently stored gzip-compressed in S3.
We are currently leveraging the Auth0 and S3 plugins of Aerobatic. I’ve been able to integrate Auth0 with Netlify RBAC using Netlify Functions, but I’m struggling to find a good way to serve the JSON files.
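Roughly, the role check in my function looks like the sketch below (simplified: in the real function the Auth0 JWT signature is verified against the tenant’s JWKS with a library; the header format and the custom roles claim name here are just placeholders):

```javascript
// Extract the first role from a bearer token's payload.
// NOTE: this only decodes the payload segment — it does NOT verify the
// signature; that happens separately with jsonwebtoken + JWKS.
function roleFromAuthHeader(authHeader, rolesClaim = "https://example.com/roles") {
  if (!authHeader || !authHeader.startsWith("Bearer ")) return null;
  const token = authHeader.slice("Bearer ".length);
  const parts = token.split(".");
  if (parts.length !== 3) return null; // not a JWT
  // JWT segments are base64url-encoded JSON.
  const payload = JSON.parse(Buffer.from(parts[1], "base64url").toString("utf8"));
  const roles = payload[rolesClaim];
  return Array.isArray(roles) && roles.length > 0 ? roles[0] : null;
}
```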
My first approach was to use Git LFS and Netlify Large Media, but that requires the backend that generates the files to log in to both GitHub and Netlify, which is very cumbersome, as it runs as a Kubernetes cron job using a Docker image.
My second approach was to write a Netlify function to serve the files from S3, similar to the Aerobatic S3 plugin (https://www.aerobatic.com/docs/plugins/s3-proxy/), taking care to forward cache-related headers in both the request and the response, but I hit two problems:
- The function’s response body has to be a string, while I just want to stream the binary (gzip’ed) body from S3 untouched.
- RBAC cannot be applied to Netlify Functions (which I can work around by implementing authentication).
Does anybody have a suggestion for how to accomplish this?
Øyvind Matheson Wergeland