Initializing failed because of cached pip

Hey there,

I’ve been using Netlify to deploy some static pages for our data team, which are generated by dbt. Since I added a new dependency (dbt-score==0.3.0), the initialization step has been failing, while the build step works as expected.

I’m assuming the problem is that Netlify is using a cached pip version — more specifically, one that does not yet know about the (relatively new) package. Deploying via “Clear cache and deploy site” did not solve the issue.

A potential fix could be to pass a --no-cache-dir flag to this step, but I can’t find where or how to do that.
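From what I can tell, pip also honors a standard `PIP_NO_CACHE_DIR` environment variable equivalent to that flag, so one option might be setting it in `netlify.toml` — an untested sketch, assuming variables under `[build.environment]` are visible to pip during Netlify’s dependency-install step:

```toml
# netlify.toml — untested sketch; assumes [build.environment] variables
# reach pip during the dependency installation stage
[build.environment]
  PIP_NO_CACHE_DIR = "1"
```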

Any help is appreciated! All relevant details are below.

Netlify site

ornate-valkyrie-33286f

Build settings

Logs

11:19:27 AM: build-image version: ecdc8b770f4a0193fd3f258c1bc6029e681813a4 (focal)
11:19:27 AM: buildbot version: 1bec0dae9abbe5ce035ddce1622c9a36d5384396
11:19:27 AM: Building without cache
11:19:27 AM: Starting to prepare the repo for build
11:19:28 AM: No cached dependencies found. Cloning fresh repo
11:19:28 AM: git clone --filter=blob:none git@bitbucket.org:****/data_transformations
11:19:28 AM: Preparing Git Reference refs/heads/main
11:19:30 AM: Custom build path detected. Proceeding with the specified path: ''
11:19:31 AM: Starting to install dependencies
11:19:31 AM: Python version set to 3.8
11:19:31 AM: Installing pip dependencies
11:19:31 AM: Started restoring cached pip cache
11:19:31 AM: Finished restoring cached pip cache
11:19:32 AM: Collecting agate==1.7.1
11:19:32 AM:   Downloading agate-1.7.1-py2.py3-none-any.whl (97 kB)
11:19:32 AM: Collecting asn1crypto==1.5.1
11:19:32 AM:   Downloading asn1crypto-1.5.1-py2.py3-none-any.whl (105 kB)
11:19:32 AM: Collecting attrs==23.1.0
11:19:32 AM:   Downloading attrs-23.1.0-py3-none-any.whl (61 kB)
11:19:33 AM: Collecting Babel==2.12.1
11:19:33 AM:   Downloading Babel-2.12.1-py3-none-any.whl (10.1 MB)
11:19:33 AM: Collecting beautifulsoup4==4.12.2
11:19:33 AM:   Downloading beautifulsoup4-4.12.2-py3-none-any.whl (142 kB)
11:19:34 AM: Collecting boto3==1.26.165
11:19:34 AM:   Downloading boto3-1.26.165-py3-none-any.whl (135 kB)
11:19:34 AM: Collecting botocore==1.29.165
11:19:34 AM:   Downloading botocore-1.29.165-py3-none-any.whl (11.0 MB)
11:19:35 AM: Collecting certifi==2023.7.22
11:19:35 AM:   Downloading certifi-2023.7.22-py3-none-any.whl (158 kB)
11:19:35 AM: Collecting cffi==1.15.1
11:19:35 AM:   Downloading cffi-1.15.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (442 kB)
11:19:35 AM: Collecting charset-normalizer==3.2.0
11:19:36 AM: Failed during stage 'Install dependencies': dependency_installation script returned non-zero exit code: 1
11:19:36 AM:   Downloading charset_normalizer-3.2.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (199 kB)
11:19:36 AM: Collecting click==8.1.6
11:19:36 AM:   Downloading click-8.1.6-py3-none-any.whl (97 kB)
11:19:36 AM: Collecting colorama==0.4.6
11:19:36 AM:   Downloading colorama-0.4.6-py2.py3-none-any.whl (25 kB)
11:19:36 AM: Collecting dbt-core==1.6.0
11:19:36 AM:   Downloading dbt_core-1.6.0-py3-none-any.whl (991 kB)
11:19:36 AM: Collecting dbt-extractor==0.4.1
11:19:36 AM:   Downloading dbt_extractor-0.4.1-cp36-abi3-manylinux_2_5_x86_64.manylinux1_x86_64.whl (1.1 MB)
11:19:36 AM: Collecting dbt-postgres==1.6.0
11:19:36 AM:   Downloading dbt_postgres-1.6.0-py3-none-any.whl (24 kB)
11:19:36 AM: Collecting dbt-redshift==1.6.0
11:19:36 AM:   Downloading dbt_redshift-1.6.0-py3-none-any.whl (37 kB)
11:19:36 AM: ERROR: Could not find a version that satisfies the requirement dbt-score==0.3.0 (from -r requirements.txt (line 17)) (from versions: none)
11:19:36 AM: ERROR: No matching distribution found for dbt-score==0.3.0 (from -r requirements.txt (line 17))
11:19:36 AM: Error installing pip dependencies
11:19:36 AM: Failing build: Failed to install dependencies

Based on their readme:

It appears to work only with Python 3.10+. Does it work with Python 3.8? If not, it won’t work on Netlify at present.
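That would also explain the “(from versions: none)” in the log: when every published release is excluded by its `requires_python` marker, pip reports that no versions exist at all. A minimal sketch of that filtering, assuming (per the readme) that dbt-score 0.3.0 declares `requires_python >= 3.10`:

```python
# Sketch of pip's requires_python filtering (assumption: dbt-score
# 0.3.0 declares requires_python >= 3.10, as its readme suggests).

def meets_minimum(interpreter: str, minimum: str) -> bool:
    """Compare versions numerically, not as strings ("3.8" < "3.10")."""
    return tuple(map(int, interpreter.split("."))) >= tuple(map(int, minimum.split(".")))

# Netlify's build image runs Python 3.8, so every release is filtered
# out and pip falls back to "(from versions: none)":
print(meets_minimum("3.8", "3.10"))   # False — excluded on Netlify
print(meets_minimum("3.10", "3.10"))  # True — works locally
```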

Thank you for your fast response!

That’s a bummer, then! I’ll implement a workaround for now.