Uploading large files to S3 bucket via Netlify function

We want to collect images and videos from users. We created a Lambda function that uploads them to an AWS S3 bucket. However, because of Netlify's 10-second execution limit, the function times out when the file is large or the connection is slow. What is the best practice here?

@michal have you considered using a service like Filestack instead – something that specializes in large uploads? It provides a nice UI as well.

There is a cookbook example on the RedwoodJS site that demos this.

How exactly does it solve the problem? If we implement it directly in Vue, we will be exposing tokens publicly (just Filestack tokens instead of AWS ones). If we implement it in Netlify functions, we will still be limited by 10 seconds. Or did I miss something?

@michal Great questions. Filestack has that covered, and it does in fact require a Netlify function to be fully secure.

Have a look at the Security section of Filestack’s docs. There are two ways to secure requests: signed policies and domain whitelists.

Requests are authenticated by checking a policy string that is signed by a shared secret.

Authentication and authorization against our APIs relies on Base64URL-encoded JSON “policies” and HMAC-SHA256 “signatures”. The policy determines which actions are authorized and the signature authenticates the policy. Depending on the API, these values may be required as part of the path, query parameters, or body of a request.

The secret used for signing is automatically generated for each application. The secret should be carefully protected and never exposed client-side. A secure application stack requires backend code that generates and signs short-lived and limited-scope policies for clients.

So, you’re right that on the client side you’d expose the API key, but you’d create a Netlify function that uses a secret kept in a protected environment variable to generate a token that is valid for only a short while and can be used to upload files – or perform any other permitted action. Your function would hand that token to the Filestack client.

You can enforce lots of rules in the policy: maximum upload size, the storage path, preventing upload overwrites, and so on.

The second layer of security is domain whitelisting:

Domain whitelisting prevents File Picker from being embedded on unapproved websites. Whitelisting works by blocking requests that don’t contain an approved domain in the “Origin” header. It’s one way of securing your solution and your resources, so others don’t attempt to piggyback on your account.

And they do say that domain whitelisting alone isn’t enough – you should do both.

Now that I think about it, if you rolled your own upload function, you’d also have to secure it in some way to prevent it being used by the outside world. Otherwise anyone could POST to the endpoint.

And you’ll also have to secure the function that generates the signed policy. Assuming the file upload comes from an authenticated user, that function can authenticate the request (via the Netlify event headers or context, depending on your setup) and then check for a role permission like “generate:upload-token”, or something along those lines.

Hope that helps/makes sense.


The solution that @dthyresson described is probably the best one I am aware of as well. :+1: Dynamically generating a short-lived token in a function is very similar to how JWTs work.


Thanks @Dennis!

I think if I were using Netlify Identity, then in the function that generates and returns that short-lived Filestack token, I’d be sure to check the context to confirm an authorized request is being made (i.e., don’t just let anyone call it to mint tokens) – and the same goes for any other auth provider.

99% of the functions I write – which often respond to external webhooks – take a signed JWT and check the issuer, audience, and expiration before running, and return a 401 if verification fails.


That’s a good point! I also recommend doing what @dthyresson described. :slight_smile: