Netlify function with puppeteer breaks if I make any changes

So after hours of battling over what settings to use, I finally got my function to work, but now if I change even a single setting, like removing a space, the code starts producing an error after a new build.

My function uses puppeteer-core (9.1.1), chrome-aws-lambda (9.1.0), and discord.js (12.5.3; I had to downgrade to get it working with Node 12). I run esbuild via node_bundler in netlify.toml to keep the function under the 50 MB limit. I set NODE_VERSION=12 and AWS_LAMBDA_JS_RUNTIME=nodejs12.x, along with the two other Discord env vars, in the Netlify UI.

All the documentation points to needing AWS_LAMBDA_JS_RUNTIME=nodejs12.x as an env var, but that doesn’t seem to change anything and Node keeps installing as 16. It’s adding NODE_VERSION=12 that actually downgrades the Node version.
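As a concrete reference, here is a hedged sketch of how those settings fit together in netlify.toml (the same variables can be set in the Netlify UI instead, as described above; values are the ones from this post, not a recommendation, and later replies note that Node 12 support is being dropped):

```toml
# Sketch only: the settings discussed in this thread.
# NODE_VERSION / AWS_LAMBDA_JS_RUNTIME can also be set in the Netlify UI.
[build.environment]
  NODE_VERSION = "12"
  AWS_LAMBDA_JS_RUNTIME = "nodejs12.x"

[functions]
  node_bundler = "esbuild"
```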

I am using these versions because they are what I finally got working. I am open to other options, but I find it strange that I can’t even change a Discord env var or remove a space from the code without it breaking.

Last working deploy

Changed the WHATCHANNEL_ID env in the netlify UI and retriggered a deploy and it no longer works

GitHub of the last working code when the function works.

The error that happens IF anything changes is

    ERROR  Invoke Error 	{"errorType":"Error","errorMessage":"Failed to launch the browser process!\n/tmp/chromium: error while loading shared libraries: cannot open shared object file: No such file or directory\n\n\nTROUBLESHOOTING:\n","stack":["Error: Failed to launch the browser process!","/tmp/chromium: error while loading shared libraries: cannot open shared object file: No such file or directory","","","TROUBLESHOOTING:","","    at onClose (/var/task/node_modules/puppeteer-core/lib/cjs/puppeteer/node/BrowserRunner.js:194:20)","    at Interface.<anonymous> (/var/task/node_modules/puppeteer-core/lib/cjs/puppeteer/node/BrowserRunner.js:184:68)","    at Interface.emit (node:events:539:35)","    at Interface.close (node:readline:586:8)","    at Socket.onend (node:readline:277:10)","    at Socket.emit (node:events:539:35)","    at endReadableNT (node:internal/streams/readable:1345:12)","    at processTicksAndRejections (node:internal/process/task_queues:83:21)"]}

If I change an environment variable for my Discord channel (without touching the code) and trigger a new deploy, that produces the same error. I cannot change any settings or it breaks.

I have a feeling it’s because I am on a Mac 99% of the time, but when I was messing with this at home I originally got it working on my Windows PC. I think there must be an issue with node modules installed from Mac vs. Windows; maybe someone knows how to get around this?


Hey there, @Kidron :wave:

Thanks for your patience here. I believe you are blocked because your dependencies are outdated. For example, puppeteer is now on 18.2.1. Here is some documentation outlining puppeteer versions.

Additionally, you will no longer be able to use or depend on node 12. We have announced this change here: Netlify CLI: Dropping support for Node.js version 12. I have shared this feedback with the Docs team so that they are aware you encountered difficulties with this.

Please update all of your dependencies to the latest version and let us know if this unblocks you. Thanks!

It seems that the chrome-aws-lambda package no longer works on Netlify. The package appears to be more or less unmaintained. I don’t know whether the issue is in that package or in the build image Netlify is using, but one of them needs to be updated. I’m just waiting for either to be fixed.

In the meantime I temporarily fixed this by updating my endpoints to use browserless: just change your code so puppeteer connects to an endpoint where Chromium is already running, instead of trying to run puppeteer inside Netlify.

Of course on large volumes this might raise your costs and make the endpoint a little bit slower than running puppeteer in Netlify.
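For anyone wanting to try the same workaround, the change is roughly swapping puppeteer.launch() for puppeteer.connect(). Here is a minimal sketch for the deployed function, assuming a browserless-style WebSocket endpoint and a BROWSERLESS_TOKEN env var (both names are illustrative; check your provider’s docs for the exact endpoint format):

```javascript
const puppeteer = require("puppeteer-core");

exports.handler = async () => {
  // Connect to an already-running Chromium instead of launching one in the function.
  const browser = await puppeteer.connect({
    browserWSEndpoint: `wss://chrome.browserless.io?token=${process.env.BROWSERLESS_TOKEN}`,
  });
  try {
    const page = await browser.newPage();
    await page.goto("https://example.com");
    const title = await page.title();
    return { statusCode: 200, body: title };
  } finally {
    browser.disconnect(); // leave the remote browser running for reuse
  }
};
```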

Thanks for sharing this, @villepie!

I was able to find a solution and stay on Netlify. I used the package GitHub - Sparticuz/chromium: Chromium (x86-64) for Serverless Platforms instead of chrome-aws-lambda. This allows me to use the latest version of puppeteer-core and Node 16. Zero issues running this on Netlify so far.
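For anyone following along, the launch call with @sparticuz/chromium looks roughly like the sketch below, based on the package README at the time. Note that the executablePath API has changed between releases (a plain property in some versions, a function in later ones), so check the README for the version you pin:

```javascript
const chromium = require("@sparticuz/chromium");
const puppeteer = require("puppeteer-core");

exports.handler = async () => {
  // Launch the Chromium binary that @sparticuz/chromium unpacks into /tmp.
  const browser = await puppeteer.launch({
    args: chromium.args,
    defaultViewport: chromium.defaultViewport,
    executablePath: await chromium.executablePath(), // a plain property in some versions
    headless: chromium.headless,
  });
  try {
    const page = await browser.newPage();
    await page.goto("https://example.com");
    return { statusCode: 200, body: await page.title() };
  } finally {
    await browser.close();
  }
};
```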

If anyone else finds this, make sure you read the docs on which version of puppeteer you need to use.

puppeteer ships with a preferred version of chromium. In order to figure out what version of @sparticuz/chromium you will need, please visit Puppeteer’s Chromium Support page.

For example, as of today, the latest version of puppeteer is 18.0.5. The latest version of chromium stated on puppeteer’s support page is 106.0.5249.0. So you need to install @sparticuz/chromium@106.
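To make the pairing rule concrete, here is a tiny illustrative helper (hypothetical, not part of either package) that maps a puppeteer release to the @sparticuz/chromium major it needs. The two entries are the pairs mentioned in this thread; always confirm against Puppeteer’s Chromium support page:

```javascript
// Hypothetical lookup table: puppeteer release -> bundled Chromium major.
// Entries taken from the versions discussed in this thread.
const chromiumMajorFor = {
  "18.0.5": 106, // Chromium 106.0.5249.0
  "19.1.1": 107,
};

function requiredChromiumPackage(puppeteerVersion) {
  const major = chromiumMajorFor[puppeteerVersion];
  if (major === undefined) {
    throw new Error(`No known Chromium major for puppeteer ${puppeteerVersion}`);
  }
  return `@sparticuz/chromium@${major}`;
}

console.log(requiredChromiumPackage("18.0.5")); // "@sparticuz/chromium@106"
```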

Thank you for coming back and letting us know.

Hey there, @Kidron :wave:

I hope you are doing well! I wanted to follow up here and let you know that we made a change to our documentation to reflect that you now need to use node 16 or higher. Thank you again for surfacing your feedback – this helps us improve!

Happy building :rocket:

@Kidron I’m trying this solution but I’m going about 5 MB over the 50 MB limit. Did you not hit this?

Hey Wes! Big fan!

I did find a way past this. Deploying serverless functions with the full puppeteer package as a dev dependency has been hit or miss: some of the functions I’ve spun up stay under the 50 MB limit and some don’t. To get around it, I removed puppeteer as a dev dependency entirely, and I have had no issues doing it this way.

When I was testing everything before I came to my solution, I was using esbuild in my netlify.toml, and that allowed me to use puppeteer and keep my function under 50 MB. That may not be necessary now, but specifying it in netlify.toml did make a difference when the function was built.
Link to doc: Modern, faster Netlify Functions: New bundler and JavaScript features

  [functions]
  node_bundler = "esbuild"

In my package.json I have puppeteer-core, @sparticuz/chromium, discord.js, and dotenv, and that had no issue with the size limit.

Hopefully, that helps you, let me know either way! Excited to see what you are building.

Thank you! That should help a lot!

@Kidron :slight_smile: Thanks for the reply. Do you mind posting exactly which versions of puppeteer-core and @sparticuz/chromium you are using? I switched to esbuild but am still at 53 MB with the latest of everything; it did save 2 MB. I have like 20 lines of code besides that, so it’s 99.9999% these two deps.

I feel like reverting to an older version will fix it…

Here is my function:

And my netlify.toml using esbuild

Still getting 53 MB, and this is the oldest version of @sparticuz/chromium available.

I am using

"@sparticuz/chromium": "^106.0.2",
"puppeteer-core": "^18.2.1"

I also have another function running this combo

"@sparticuz/chromium": "^107.0.0",
"puppeteer-core": "^19.1.1"

You may have seen this part above, but just in case:
Puppeteer ships with a specific version of chromium.
For example, as of today, the latest version of puppeteer is 18.0.5. The latest version of chromium stated on puppeteer’s support page is 106.0.5249.0. So you need to install @sparticuz/chromium@106.
Source: GitHub - Sparticuz/chromium: Chromium (x86-64) for Serverless Platforms

It looks like you have a mismatched version pair, which might cause issues when your function runs, but the size issue shouldn’t be caused by that. Maybe try removing esbuild and external_node_modules after upgrading puppeteer-core to a newer version.

Here is a link to newer code I wrote, based on the original I linked in this article. It’s using 6 packages without esbuild in netlify.toml and it hasn’t had any trouble with the size. Once I figured out the puppeteer-core and @sparticuz/chromium versions it was smooth sailing.

Hope that helps, just trying to include as much info as possible.

Thanks a ton. Using your exact versions makes it 55 MB, 53 with esbuild.

So bizarre. The only difference is that I’m using a separate package.json for my function, whereas you have your dependencies as part of the main project.

Going to try to replicate your function entirely and see if I can get it under 50 MB. Thanks again.

Got it working. I had to set AWS_LAMBDA_JS_RUNTIME to nodejs12.x.

That’s going to be an issue in a few months, but it will be an issue for everyone using puppeteer in a function, so I guess I’ll see you all in a few months.

I am glad you got it working again. I didn’t realize this was a function that was previously working.

I originally had my scraper working with AWS_LAMBDA_JS_RUNTIME: nodejs12.x and older versions of chrome-aws-lambda, but after making a single change it would fail to find Chrome at runtime. That’s when I found the new solution that works, but I was working with a new Netlify site; I think that is the difference.
I wonder what would happen if you spun up a new Netlify site with a fresh function using those versions of @sparticuz/chromium and puppeteer-core.

I have done this same setup 3 times now since figuring this out, and all 3 worked the first time with no size or Chrome issues.

@sparticuz here… Unfortunately, Chromium itself is the culprit here. It is 152 MB uncompressed, and I’m able to compress it down to 51 with brotli. If you are using a custom packaging solution, you could drop the bin folder to save another 3.2 MB; however, if you do that you won’t get OpenGL, which many people don’t need.


The day came when I couldn’t deploy my site with Node 12 functions. Simply swapping over to @Sparticuz’s @sparticuz/chromium@92.0.0 works with Node 18 and keeps it under the 50 MB limit. He also made the library able to point to external Chrome downloads, so that would work if you need a more modern version of Chrome.
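For the external-download route, there is a slimmer @sparticuz/chromium-min variant whose executablePath() accepts a URL to a hosted Chromium pack. A hedged sketch (the URL is a placeholder; see the package README for the real pack files and the exact API for your pinned version):

```javascript
const chromium = require("@sparticuz/chromium-min");
const puppeteer = require("puppeteer-core");

// Sketch: fetch Chromium from an external pack URL at cold start instead of
// bundling the binary inside the function zip. The URL below is a placeholder.
async function launchBrowser() {
  return puppeteer.launch({
    args: chromium.args,
    executablePath: await chromium.executablePath(
      "https://example.com/chromium-pack.tar"
    ),
    headless: chromium.headless,
  });
}
```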

@Kidron @wesbos are your current solutions still working for you? I keep getting “ can’t be found” style errors.

I’m using Wes’ latest post packages and esbuild.

I’ve created a simple repo and function here that attempts to get the h1 from a page:
GitHub - absentees/puppet-function: An attempt to launch puppeteer from a netlify function

Also interested to know how I can tell if I’m hitting memory limits…

Based on the past discussion, this seems to be the solution, @absentees. Have you tried that?