So what you’re seeing is it working as designed, though I understand that’s not what you’d wish.
Ah, I missed that in the scripts, even though I had just looked at them. I think I incorrectly assumed that behaviour was tied up with the cache fetching that happens before the scripts run.
You can of course run bundle install yourself in this case, before building, and that may work for you, depending on your site layout.
When I tried that I found I lost the caching of Gem modules, and some gems with platform-specific builds take a very long time to install. In the end I changed tack and created a Gem containing just the shared dependencies, and gave each module a simple Gemfile that depends on it.
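For anyone hitting the same issue, the shape of that workaround looks roughly like this — a minimal sketch, where the gem name is made up for illustration:

```ruby
# Gemfile for one module of the site — a minimal sketch.
# "site_shared_deps" is a hypothetical gem that itself declares
# all the dependencies common to every module, so each module's
# Gemfile stays a one-liner and the resolved versions stay
# consistent across modules.
source "https://rubygems.org"

gem "site_shared_deps", "~> 1.0"
```

Bumping the shared gem's version is then the only change needed to update dependencies everywhere at once.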
By the way, would it be possible to share the cache more between different builds? E.g. a preview build and a published build usually use the same Gemfile dependencies, so sharing the cache would speed builds up when those are unchanged.
Re: how we fetch submodules, it looks like this. Indeed, you cannot change the behaviour, but if it is suboptimal, please feel empowered to file an issue (here) describing what you’d like to see work differently, and explain your use case a bit!
git clone <project> ; cd project ; git submodule update -f --init
So I’m picking up maintenance of a site that uses git submodules. I’m not convinced they are the best way of managing dependencies here, but it is what we have. It’s interesting to see which git commands you use; I was expecting some variant of
git clone --recurse-submodules
Anyway, this hopefully explains our use case. In summary, we need the latest commit of each submodule when we deploy, so I prefix all builds with
git submodule update --init --remote
(local builds using the Netlify CLI, Netlify builds, and a GitHub workflow). I don’t need --recurse in this case, as the git submodules used in subsections of the site are only needed for their own preview builds. The cost of the separate iteration through all submodules with two git submodule commands is probably not worth worrying about.
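As a concrete sketch of where that prefix can live on Netlify, a netlify.toml build command runs it for both preview and published builds; the build command and publish directory below are stand-ins for whatever the site actually uses:

```toml
# netlify.toml — a minimal sketch.
# "jekyll build" and "_site" are placeholders for the site's
# real build command and output directory.
[build]
  command = "git submodule update --init --remote && jekyll build"
  publish = "_site"
```

The same one-liner prefix works verbatim as a step in a GitHub Actions workflow or before a local netlify build.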
Thanks again, and wishing everyone at Netlify a happy Christmas!