We have a problem where the Deploy Preview of our Gatsby SSR site is crashing. The error is this:
Error: ENOENT: no such file or directory, lstat '/var/task/.cache/data'
at Object.lstatSync (node:fs:1634:25)
Static pages build fine. For context, this is a CRA React monorepo, with one specific app in ./apps running Gatsby. It isn’t running anything beyond the required default plugins, and the settings in Netlify were added manually and, as far as I can see, are correct. I initially ran a PoC on a separate, personal repo to check whether SSR worked out of the box with Netlify, so I know it “should” work.
I even tried adding mkdir -p /var/task/.cache/data to the yarn build script, to see if I could at least force-create the directory (no joy, because I don’t have permission to create anything under /var).
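For reference, the change was roughly this sketch of the build script in the app’s package.json (illustrative; the exact wiring on our side may differ slightly):

"scripts": {
  "build": "mkdir -p /var/task/.cache/data && gatsby build"
}

In hindsight the mkdir presumably runs on the build image rather than inside the function environment where /var/task actually exists, so it was a long shot anyway; the immediate failure was the permissions error.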
Is this directory configurable, or is it 100% internal to Gatsby?
I only found ONE instance of the same problem on these fora, and no resolution was apparently found. I did try to downgrade the Gatsby version as suggested by the support staff, but that broke the build entirely, I think because there were suddenly conflicts with (newer) plugin versions. TBH I don’t want to pin my version for what seems like a simple enough scenario: run Gatsby from a monorepo.
I also found this thread, where it seemed like a similar problem was solved previously?
@hrishikesh Apologies for CC’ing you directly, but I’m wondering if you could help here, as you directed the previous user to that supposed fix. Was it ever included in an eventual release?
Only pushing because our initial discovery phase on the viability of Gatsby for our needs seems to have hit a roadblock once we added it to a pre-existing monorepo…
@hrishikesh OK, I’ve managed to recreate this on a public repo. I set it up as a workspace: the SAME repo worked perfectly when it was just a “normal” project, so it’s interesting that as soon as I moved the Gatsby instance into a workspace app, the SSR broke…
Let me know if you need anything else; I’d appreciate some help here, and I’m glad I could recreate it on a public repo!
(Just to test the private repo, I also removed EVERY other app from the workspace except the Gatsby app, did a fresh yarn install, etc., and got the same result… so again, this seems to point towards a flaw with workspaces. The reduced layout is sketched below.)
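For anyone following along, the reduced workspace layout is essentially this (the app directory name matches the stack trace later in this thread; the root package.json details are illustrative, not a copy of the repo):

package.json        <- root, with "private": true and "workspaces": ["apps/*"]
yarn.lock
apps/
  gatsby-test/
    package.json    <- depends on gatsby, "build": "gatsby build"
    gatsby-config.js
    src/pages/…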
This is a hard blocker for a critical project, and if workspaces are a problem with Gatsby I’d appreciate either a fix or confirmation. It needs to run in a workspace environment.
Thanks
I think I have more detail as well. To elaborate on the problem and help anyone at Netlify / Gatsby reading this: I think workspaces are the problem, specifically the paths used to access data during function startup / execution. Hopefully this helps clear up what I believe is an internal problem with Gatsby and workspace apps.
Given Gatsby is now also Netlify, I’m HOPING there’s some coordination available to resolve this. I can’t be the first person trying to run SSR from a Yarn workspace app.
I ran yarn build to see what was generated locally in .cache. I can see .cache/page-ssr/... is generated fine, and .cache/data/ is also present. So locally the build seems OK; everything is where I’d expect it.
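If anyone wants to repeat the same sanity check locally, this is roughly what I looked at (run from the workspace app’s directory, paths as in my repro):

cd apps/gatsby-test
yarn build
ls .cache/page-ssr    # lambda.js and friends are generated
ls .cache/data        # present locally after the build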
But then I’m looking at the stack trace of the SSR error from the deployed site I mentioned above:
An unhandled error in the function code triggered the following message:
Error - ENOENT: no such file or directory, lstat '/var/task/.cache/data'
Stack trace
Error: ENOENT: no such file or directory, lstat '/var/task/.cache/data'
at Object.lstatSync (node:fs:1666:3)
at Object.lstatSync (/var/task/node_modules/graceful-fs/polyfills.js:318:34)
at statFunc (/var/task/node_modules/fs-extra/lib/util/stat.js:24:20)
at getStatsSync (/var/task/node_modules/fs-extra/lib/util/stat.js:25:19)
at Object.checkPathsSync (/var/task/node_modules/fs-extra/lib/util/stat.js:64:33)
at Object.copySync (/var/task/node_modules/fs-extra/lib/copy/copy-sync.js:27:38)
at setupFsWrapper (/var/task/apps/gatsby-test/.cache/page-ssr/lambda.js:153:10)
at Object.<anonymous> (/var/task/apps/gatsby-test/.cache/page-ssr/lambda.js:172:16)
at Module._compile (node:internal/modules/cjs/loader:1364:14)
at Module._extensions..js (node:internal/modules/cjs/loader:1422:10)
Look at the paths: setupFsWrapper() is invoked from /var/task/apps/gatsby-test/.cache/page-ssr/lambda.js … i.e. the workspace app’s build path. That’s fine AFAIK and correct.
However, look at the error: it’s trying to access /var/task/.cache/data, when I think it SHOULD be trying to access /var/task/apps/gatsby-test/.cache/data. No wonder it’s crashing; it seems the relative paths aren’t resolved correctly in all cases…
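To make the suspicion concrete, here’s a tiny Node sketch of the two ways that copy source could be resolved. This is NOT Gatsby’s actual code, just an illustration of cwd-relative vs package-relative resolution using the paths from the stack trace:

const path = require("path");

// Inside the deployed function the process runs from /var/task,
// but the bundled lambda lives under the workspace app's folder.
const functionCwd = "/var/task";
const lambdaDir = "/var/task/apps/gatsby-test/.cache/page-ssr";

// Resolving against the cwd gives the path that is failing:
console.log(path.resolve(functionCwd, ".cache/data"));
// -> /var/task/.cache/data  (does not exist)

// Resolving against the lambda's own directory gives the path I'd expect:
console.log(path.resolve(lambdaDir, "..", "data"));
// -> /var/task/apps/gatsby-test/.cache/data

In a non-workspace project those two resolutions point at the same place, which would explain why my plain repo worked and the workspace one doesn’t.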
Thanks for following up and apologies again for the issues here! It looks like the above repository is no longer publicly available; can you advise if the permissions have changed since you shared it originally? Looking forward to getting to the bottom of this one.
We are also running into a similar issue, where we are promoting specific branches to production site instances. Although SSR continues to work as a branch deploy on the previous site instance, our new site, where that branch deploy is now production, is failing with the errors above.
Our site doesn’t use workspaces, but our Gatsby directory is one directory deep, with its own netlify.toml and package.json.
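For anyone comparing setups, that kind of layout is usually wired up with a base directory in netlify.toml, along these lines (directory names illustrative, not our exact config):

[build]
  base = "site"         # the Gatsby directory, one level below the repo root
  command = "yarn build"
  publish = "public"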
Hey Andrew, thanks for checking! I was, and I was also able to replicate this on a Yarn workspace of my own. We’ve since escalated this to our Frameworks team so they can take a closer look, and we’ll follow up here as soon as we have more insight!
I’ve run into exactly the same issue trying to migrate SSR functions in a monorepo that uses Yarn workspaces. It looks like this is our last hurdle in the marathon migration from Gatsby Cloud; happy to provide more info if required.
Edit: for anyone running into this issue specifically with Yarn workspaces/monorepos, it looks like this PR adds a fix.
I wouldn’t use it in production, but testing with the following canary releases has been successful for me so far:
@SeanMcleod We actually moved to Next.js because there was time pressure on the work and we weren’t in a position to wait for a response or fix from Gatsby. Great to see it’s being addressed, and TBH I preferred Gatsby’s stack, but unless something blows up with our Next.js implementation we won’t pivot back (again) to Gatsby.