Question about storage and rate limits on Node.js-based Netlify functions

I was thinking of writing a Netlify function to handle some image-processing tasks, but the node_modules directory for my dependencies is around 450 MB after npm install. I've learned that there is a storage limit on Netlify functions, so I'm wondering whether node_modules counts toward that limit.

Since this API idea has no login protection, I also wonder whether there is a way to rate-limit hits to Netlify functions without requiring user login (i.e., prevent someone from hitting the functions 1,000+ times from a single random IP).

Hi there,

Where did you hear about the storage limit for Netlify functions? I’d like to verify that information before we move forward :slight_smile:

AWS’s limit:
https://dzone.com/articles/exploring-aws-lambda-deployment-limits

The deploy limit for Netlify is mentioned in various scattered places, such as https://answers.netlify.com/t/failed-to-upload-functions-file-function-too-large/3549

It’s unclear whether the limit is 50 MB or 512 MB in this case; either way, I think 450 MB is quite large to deploy as a Netlify function.

It is also unclear whether the deploy size means just the app’s own file sizes or includes all dependencies.

Best regards,

Tim

So the limit mentioned in the other thread is 50 MB for a zipped function. Our buildbot will always zip up your functions for deploy, so that’s what you have to work with. That zip file will include all the dependencies that you import in your function.

As for having some sort of access control, have you considered implementing basic auth on select paths?
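For example, a minimal sketch of basic auth inside a function handler might look like this (untested; the credentials are placeholders, and it assumes incoming header names arrive lowercased):

```js
// netlify/functions/protected.js — untested sketch; credentials are placeholders
exports.handler = async (event) => {
  // assumes incoming header names are lowercased
  const header = event.headers.authorization || '';
  const expected =
    'Basic ' + Buffer.from('someuser:somepassword').toString('base64');

  if (header !== expected) {
    return {
      statusCode: 401,
      headers: { 'WWW-Authenticate': 'Basic realm="Functions"' },
      body: 'Unauthorized',
    };
  }

  return { statusCode: 200, body: 'Hello, authenticated caller!' };
};
```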

I spent some time looking into the issue. My test function was an image-processing algorithm that simply overlays a test image on top of an input image. I had a working prototype in Node.js via node-canvas.
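The core of that prototype was roughly the following (a simplified sketch, not the exact code; the file paths are placeholders):

```js
// sketch of the node-canvas overlay prototype; paths are placeholders
const { createCanvas, loadImage } = require('canvas');
const fs = require('fs');

async function overlay(inputPath, overlayPath, outputPath) {
  const base = await loadImage(inputPath);
  const stamp = await loadImage(overlayPath);

  const canvas = createCanvas(base.width, base.height);
  const ctx = canvas.getContext('2d');

  ctx.drawImage(base, 0, 0);   // input image as the background
  ctx.drawImage(stamp, 0, 0);  // test image overlaid on top

  fs.writeFileSync(outputPath, canvas.toBuffer('image/png'));
}

overlay('input.png', 'overlay.png', 'output.png').catch(console.error);
```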

But it turned out that the node-canvas dependency is nearly 100 MB, and there isn’t an easy way to install ImageMagick as part of the dependency installation process. On AWS Lambda 1.0, ImageMagick was actually an included dependency, but on 2.0, since it supports Docker image layers, it no longer comes with image-processing libraries preinstalled. Unless Netlify can support AWS Lambda layers via configuration, it seems impossible to include binary dependencies like node-canvas or ImageMagick in the function zip bundle.

Here’s an example of what a Lambda layer is: Application Search - AWS Serverless Application Repository

Ah, that’s interesting. I wonder if manually zipping your function with an included ImageMagick static binary (which you would provide, assuming it’s not too large) would help you work around the issue. In my example here: function-deploy-test/zipped-function.js at master · netlify/function-deploy-test · GitHub, I show where a file you include in your function ends up and how to address it. And this: function-deploy-test/package.json at master · netlify/function-deploy-test · GitHub shows how I went about zipping up my function manually.
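In rough terms, the pattern is to resolve the bundled file relative to the function and then execute it, something like this (a sketch rather than the repo’s exact code; the binary path and arguments are placeholders):

```js
// sketch of calling a binary bundled into a manually zipped function;
// the binary path and arguments are placeholders
const { execFile } = require('child_process');
const path = require('path');

exports.handler = async () => {
  // files zipped alongside the function are unpacked next to it at runtime
  const binary = path.join(__dirname, 'bin', 'convert');

  const stdout = await new Promise((resolve, reject) => {
    execFile(binary, ['-version'], (err, out) => {
      if (err) return reject(err);
      resolve(out);
    });
  });

  return { statusCode: 200, body: stdout };
};
```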

Let me know if that makes sense.

Hi Dennis,

The problem with that approach is that GraphicsMagick binaries seem to require building from source on the target OS (I tried several sets of instructions for statically linking the binary, and I even made an Amazon Linux 2 Docker image to build it from source before zipping). Even when uploaded, it will complain about missing libjpeg.so files. If you happen to have time to play with it and end up with different results, feel free to let me know.

Here are a few build-it-and-zip-it routes I tried that failed:

  1. Copy the binary from Ubuntu’s /usr/bin.

  2. Build it in a Docker image of Amazon Linux 2, then zip and upload it.

  3. Try to build it as part of the function (terrible idea); the environment is read-only anyway, so that doesn’t work either.

  4. Build GraphicsMagick as part of the build pipeline and copy it into the functions directory; that doesn’t quite work either, since Amazon Linux 2 seems to be a different environment from the build image.

And the bundles were very close to 50 MB; once I tried to include more .so files, they could reach ~70 MB. It turned into a complicated ImageMagick build problem of minimizing the number of modules the app needs, which is way too involved for a simple use case.

In the end, I realized that this is unnecessary for basic image processing. I ended up doing another experiment in Go for simple use cases (the total Go binary is only around 10 MB).

I can try ImageMagick and see if it works out differently, but it still looks like zip-it-and-ship-it for a Node.js function without AWS Lambda layers is not a good idea if you want to use large dependencies like ImageMagick or node-canvas (~100 MB). With Lambda layers, these would be provided by AWS, so it’s definitely worth looking into from Netlify’s perspective, since image processing + upload to S3 was actually one of the example workloads when AWS first launched Lambda.

Best regards,

Tim

Right, building a static binary of GraphicsMagick would probably not be the best approach, considering all the hoops you’d have to jump through.

I’m not sure when or if we will consider Lambda Layers, but I’ll get a feature request filed for it and will update you if we do end up supporting it.

@Dennis I have a silly idea… I could probably use a Netlify function as a passthrough proxy to an actual AWS Lambda function with Lambda layers set up (see the sketch at the end of this post). Setup cost aside, I’m not sure whether there is a way to save some of the data-transfer cost if the request comes from Netlify functions.

“Data transfers are free if you are within the same region, within the same availability zone. Data transfers within the same region, but in different availability zones, have a cost associated with them.”

What are the AWS region and availability zones for Netlify functions?
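For reference, the proxy side could be as small as this (an untested sketch; the target function name is a placeholder, and it assumes the AWS SDK can find credentials in the function’s environment):

```js
// sketch of a Netlify function that proxies to a real AWS Lambda;
// 'my-image-processor' is a placeholder, and credentials are assumed
// to be available to the SDK via the environment
const AWS = require('aws-sdk');

const lambda = new AWS.Lambda({ region: 'us-east-1' });

exports.handler = async (event) => {
  const result = await lambda
    .invoke({
      FunctionName: 'my-image-processor',
      Payload: JSON.stringify({ body: event.body }),
    })
    .promise();

  return {
    statusCode: 200,
    body: result.Payload.toString(),
  };
};
```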

Netlify functions are deployed to us-east-1, as mentioned here: Functions overview | Netlify Docs. That idea, while ‘silly’, sounds like it could possibly work. Just note that Netlify functions have a 10-second runtime limit, so your other function will need to complete within that window.

Let me know if it works. :smiley: