Lighthouse Background Function - Request must be smaller than 69905067 bytes for the CreateFunction operation

I’m trying to run a background function that uses lighthouse, which ultimately depends on Chromium binaries. Since chromium/puppeteer is too large to bundle with the function itself, I’m using Netlify Plugin Chromium to install the Chrome binaries and set the Chrome path as an environment variable during the build. This seems to work as expected:

1:05:20 PM: [NetlifyChromiumPlugin]: Setting environmental variable CHROME_PATH to /opt/build/repo/node_modules/chromium/lib/chromium/chrome-linux/chrome
1:05:20 PM: [NetlifyChromiumPlugin]: Chromium installation finished with SUCCESS (path: /opt/build/repo/node_modules/chromium/lib/chromium/chrome-linux/chrome)
1:05:20 PM: (netlify-plugin-chromium onPreBuild completed in 32.2s)
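For completeness, the plugin is enabled with the default setup in netlify.toml, something like this (I haven’t set any custom inputs):

```toml
# Assumed default wiring for the plugin; no custom inputs.
[[plugins]]
  package = "netlify-plugin-chromium"
```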

However, when I run the background function (initiated from the front end), CHROME_PATH is undefined, which results in a runtime error.

1:40:58 PM: 5e06caee ERROR  Invoke Error 	{"errorType":"Error","errorMessage":"The CHROME_PATH environment variable must be set to a Chrome/Chromium executable no older than Chrome stable.","code":"ERR_LAUNCHER_PATH_NOT_SET","message":"The CHROME_PATH environment variable must be set to a Chrome/Chromium executable no older than Chrome stable.","stack":["Error","    at new LauncherError (/var/task/node_modules/chrome-launcher/dist/utils.js:26:22)","    at new ChromePathNotSetError (/var/task/node_modules/chrome-launcher/dist/utils.js:33:9)","    at Object.linux (/var/task/node_modules/chrome-launcher/dist/chrome-finder.js:128:15)","    at Function.getFirstInstallation (/var/task/node_modules/chrome-launcher/dist/chrome-launcher.js:116:51)","    at Launcher.launch (/var/task/node_modules/chrome-launcher/dist/chrome-launcher.js:152:43)","    at Object.launch (/var/task/node_modules/chrome-launcher/dist/chrome-launcher.js:37:20)","    at audit (/var/task/functions/lighthouse-background.js:63:39)","    at Runtime.GradePages [as handler] (/var/task/functions/lighthouse-background.js:34:26)","    at processTicksAndRejections (internal/process/task_queues.js:97:5)"]}
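For context, the call that throws is essentially the standard chrome-launcher/lighthouse pattern. A rough sketch of the audit() call from the stack trace above (the flags are illustrative, not my exact code):

```js
// With no chromePath option, chrome-launcher falls back to the CHROME_PATH
// environment variable, which is undefined at runtime, hence
// ERR_LAUNCHER_PATH_NOT_SET.
const chromeLauncher = require('chrome-launcher');
const lighthouse = require('lighthouse');

async function audit(url) {
  // Throws ChromePathNotSetError when CHROME_PATH isn't set
  const chrome = await chromeLauncher.launch({ chromeFlags: ['--headless'] });
  const { lhr } = await lighthouse(url, { port: chrome.port, output: 'json' });
  await chrome.kill();
  return lhr;
}
```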

The function also doesn’t work if I hardcode the path in the background function before deploying. I’m beginning to wonder if I’m misunderstanding the use case for this build plugin, or if there’s something larger related to background functions that I’m missing here. Should I be using a different strategy? How can I bundle the required Chromium binaries for use in this function without exceeding the AWS Lambda file size limit? Any guidance would be appreciated.

Hey there! Unfortunately, I can’t speak for the plugin (you may want to ask its owner), but I can share with you our how to use env vars correctly on Netlify guide.

Please do consider this and let me know if you come unstuck!

Hey @Scott,

Thanks for the reply. I did learn shortly after posting this that environment variables generated during build time are not automatically packaged with functions uploaded to AWS Lambda. There’s another build plugin to inline those env variables in uploaded functions, but around that point it occurred to me that I was going about this all wrong.

For this to work at all, I must upload the Chromium binary along with the function, because (please correct me if I’m wrong) Lambdas don’t have access to /opt/build/repo/node_modules/chromium/lib/chromium/chrome-linux/chrome anyway; that path only exists in the build container, not at runtime.

So the build plugin I originally referred to is not my problem, and I won’t need an environment variable after all. My real problem is that I can’t figure out why I’m exceeding the AWS Lambda zipped bundle size limit of 69905067 bytes (~69.9 MB) when:

  1. Others have succeeded in doing what I’m attempting with AWS Lambda functions elsewhere, e.g. AWS Lambda directly, Serverless, and Headless Chrome on Netlify
  2. The sum of the sizes of the individual packages I’m using doesn’t exceed the aforementioned limit, as far as I can tell

A basic implementation of this looks something like the sketch after the following list. It relies on:

  1. chrome-aws-lambda 49.7 MB
  2. puppeteer-core (peer dependency of chrome-aws-lambda) 2.75 MB
  3. chrome-launcher 107 KB
  4. lighthouse itself 12.3 MB
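Here’s roughly what that looks like wired together (a sketch, not my exact code; the flags and response shape are illustrative):

```js
const chromium = require('chrome-aws-lambda');
const chromeLauncher = require('chrome-launcher');
const lighthouse = require('lighthouse');

exports.handler = async (event) => {
  const { url } = JSON.parse(event.body || '{}');

  // chrome-aws-lambda ships a Chromium build pared down for Lambda and
  // exposes its path directly, so no CHROME_PATH env var is needed
  const chrome = await chromeLauncher.launch({
    chromePath: await chromium.executablePath,
    chromeFlags: chromium.args,
  });

  const { lhr } = await lighthouse(url, { port: chrome.port, output: 'json' });
  await chrome.kill();

  return {
    statusCode: 200,
    body: JSON.stringify({ scores: lhr.categories }),
  };
};
```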

A few caveats:

  • The lighthouse figure is probably somewhat inflated, because the package includes multiple ways of running it besides lighthouse-core, although I’m not sure exactly how tree-shaking works with zisi (zip-it-and-ship-it).
  • My particular lambda also has a dependency on ioredis, which is relatively small at 294 KB.
  • The lambda function itself includes a couple of methods, so its file size is about 8 KB.

This adds up to about 65.2 MB. Is it possible that zisi is adding 4.7 MB of runtime code to the bundle? Would esbuild make a difference? Is there any way I can determine what the size of an uploaded function will be locally, before initiating a build?

I did find this note on an archived package similar to the one I linked above:

  • The file will be big (at least 75MB), so you need to upload it to S3 then deploy to Lambda from S3.

Does this mean anything in the context of Netlify background functions? It’s worth noting that this package is likely archived because it depended on @serverless-chrome/lambda, which downloads Chrome binaries directly. chrome-aws-lambda, on the other hand, ships pre-packaged with a version of Chromium that has been specifically pared down to run in a Lambda environment, and it now seems to be the more favored package.

Thanks in advance for any insight you may be able to provide here!

EDIT: added Netlify labs example of headless Chrome running in a function

Yes, chances are esbuild might help. You can try running a build locally with the Netlify CLI to see the likely bundle size of the function.

Thanks for that! I didn’t realize functions are added to .netlify when built locally.
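In case it’s useful to anyone else, this is a quick way to check the zipped size after a local netlify build run (the zip path below is an assumption based on where the CLI wrote my function; adjust it to match yours):

```js
// check-size.js: print the zipped size of the locally built function
const { statSync } = require('fs');

const zipPath = '.netlify/functions/lighthouse-background.zip';
const mb = statSync(zipPath).size / (1024 * 1024);
console.log(`${zipPath}: ${mb.toFixed(1)} MB zipped`);
```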

So unfortunately, esbuild didn’t do the trick. What’s interesting is that the zipped function is even smaller than the estimate I gave above (perhaps because of esbuild). Depending on whether I mark just chrome-aws-lambda or both chrome-aws-lambda and lighthouse as externals, it’s either 53.5 MB or 61.1 MB when built locally. That’s under the limit specified in the error, but the upload still fails. This seems very similar to the error encountered in this thread.
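For reference, the bundler settings live in netlify.toml and looked something like this during those tests (the external_node_modules line is what I toggled between the two builds):

```toml
[functions]
  node_bundler = "esbuild"
  # Externals are copied into the zip as-is instead of being inlined by
  # esbuild; this list is what varied between the 53.5 MB and 61.1 MB runs.
  external_node_modules = ["chrome-aws-lambda", "lighthouse"]
```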

It’s good to know it helped, even if not completely. But, as mentioned in that thread and here, the zip file should be below 50 MB. I’d assume it’s Lambda that needs some extra overhead and thus the limit shows ~70 MB, but the zip that you send over should be within 50 MB.
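If I had to guess where that exact figure comes from: the CreateFunction API call carries the zip base64-encoded in the request body, and base64 inflates data by a factor of 4/3. 50 MiB is 52,428,800 bytes, and 52,428,800 × 4 / 3 ≈ 69,905,067 bytes, which is precisely the number in the error. That’s back-of-the-napkin math on my part though, not something out of official docs.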

Thanks for that bit of context! If nothing else, that gives me a target to work toward. Is there any way to take advantage of uploading to S3 and then deploying to Lambda through Netlify?

Also, do you think your team could make a note to improve the error handling for this? Right now, when a build fails, the output looks something like this:

5:06:40 PM: Request must be smaller than 69905067 bytes for the CreateFunction operation
5:06:41 PM: Request must be smaller than 69905067 bytes for the CreateFunction operation
5:06:42 PM: Request must be smaller than 69905067 bytes for the CreateFunction operation
5:06:44 PM: Request must be smaller than 69905067 bytes for the CreateFunction operation
5:06:46 PM: Request must be smaller than 69905067 bytes for the CreateFunction operation
5:06:50 PM: Request must be smaller than 69905067 bytes for the CreateFunction operation
5:06:56 PM: Request must be smaller than 69905067 bytes for the CreateFunction operation
5:06:59 PM: Request must be smaller than 69905067 bytes for the CreateFunction operation
5:07:11 PM: Request must be smaller than 69905067 bytes for the CreateFunction operation
5:07:23 PM: Request must be smaller than 69905067 bytes for the CreateFunction operation
5:07:42 PM: Request must be smaller than 69905067 bytes for the CreateFunction operation
5:08:15 PM: Request must be smaller than 69905067 bytes for the CreateFunction operation
5:08:15 PM: Failed to upload file: lighthouse-background

Similarly, the local build output looks like this:

⠸ (0/1) Uploading lighthouse-background... ›   Warning: JSONHTTPError:  422
 ›   Warning: 
 ›   {
 ›     "name": "JSONHTTPError",
 ›     "status": 422,
 ›     "json": {
 ›       "errors": "Request must be smaller than 69905067 bytes for the CreateFunction operation"
 ›     }
 ›   }
 ›
⠴ (0/1) Uploading lighthouse-background...    JSONHTTPError: Unprocessable Entity

I assume the duplicate errors are from the retries. I think it would be a good idea to catch this and display an error noting that 50 MB is the limit instead. Alternatively, an additional error could be logged right below 5:08:15 PM: Failed to upload file: lighthouse-background that notes the actual limit.
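Something like this hypothetical mapping is the kind of thing I have in mind (I don’t know the CLI’s actual upload code, so the names here are made up):

```js
// Hypothetical sketch: translate the raw Lambda 422 into a message that
// names the actual 50 MB zipped limit. LAMBDA_LIMIT_RE and
// friendlyUploadError are made-up names, not real CLI internals.
const LAMBDA_LIMIT_RE = /Request must be smaller than (\d+) bytes/;

function friendlyUploadError(err, functionName) {
  const raw = (err.json && err.json.errors) || err.message || '';
  if (LAMBDA_LIMIT_RE.test(raw)) {
    return (
      `Failed to upload "${functionName}": the zipped function exceeds ` +
      `AWS Lambda's 50 MB direct-upload limit. Try reducing bundled ` +
      `dependencies or marking large packages as external.`
    );
  }
  return raw;
}
```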

I’d not think so. No matter how you upload, I think Lambda will always accept just 50 MB max.

Yes, we’d pass this on to the team. Hopefully, they can improve it for the future.

This doesn’t seem to be true, based on an expanded view of the screenshot and the link you provided above.

Admittedly, this is not super clear, but if this article is to be believed, it’s possible to take advantage of a much larger file size limit (250 MB) by uploading to S3 and then deploying from there. From the article:

We’ve found the true limit of the size of the uploaded deployment package: it’s the 250 MB uncompressed deployment code/dependencies limit. In other words, we can upload, via S3, any Lambda function zip whose extracted contents are less than a combined 250 MB.

And:

Unfortunately, some tools like Apex don’t currently support the S3 method. I’m only aware of Serverless using S3 when deploying Lambda functions—because of its use of CloudFormation.

This explains why I was able to find multiple examples around the web of people successfully deploying lambdas virtually identical to my own. Some of the examples I found were deployed with the Serverless framework, while others were deployed to AWS with the S3 workaround. At the time, I was unaware that Serverless also took advantage of this workaround.

It may be worth considering replicating this method if possible. The flexibility may prove useful for use cases such as the one that spawned this thread.
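For concreteness, this is roughly what the S3 variant of the API call looks like with the AWS SDK (the bucket, key, role, and function names are hypothetical placeholders, and I obviously don’t know how Netlify’s deploy pipeline is built):

```js
const { LambdaClient, CreateFunctionCommand } = require('@aws-sdk/client-lambda');

async function deployFromS3() {
  const lambda = new LambdaClient({ region: 'us-east-1' });

  // Passing Code.S3Bucket/S3Key instead of Code.ZipFile sidesteps the
  // ~70 MB request limit; the 250 MB unzipped limit applies instead.
  await lambda.send(new CreateFunctionCommand({
    FunctionName: 'lighthouse-background',          // hypothetical
    Runtime: 'nodejs12.x',
    Role: 'arn:aws:iam::123456789012:role/fn-role', // hypothetical
    Handler: 'lighthouse-background.handler',
    Code: { S3Bucket: 'my-deploy-bucket', S3Key: 'lighthouse-background.zip' },
  }));
}
```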

Yes, we’d pass this on to the team. Hopefully, they can improve it for the future.

Thanks!

I should have rephrased that. What I meant was: since Netlify will accept only zipped versions of JS functions, I was assuming you’d be zipping the functions even when uploading them elsewhere. Sure, unzipped you can have 250 MB, but I personally don’t know how one would upload unzipped functions.

Alright, we can pass that on to the team, but the actual implementation might take time.


Totally understandable! Please let me know if there’s anything I can do to help, e.g. if you’d like me to more formally request this as a feature somewhere, or if you or someone else at Netlify would like to discuss it in further detail.

Thanks again for shining some light on some of the platform’s implementation details, and thank you for being responsive and open to ideas!

We’d discuss it with the team internally; no action is needed from your end. We usually file feature requests too, but this is the first time such a thing has actually been requested, so I’m not sure if the platform is ready to handle it, or whether it would have an impact I can’t foresee.

Nevertheless, the team will consider this and keep it on their radar, so they’ll at least know that, yes, there have been requests for this.