GCP - Cloud Function can't find Python package from Artifact Registry in the same project

I've been trying GCP's Artifact Registry, which is currently in alpha for Python packages.

I do the authentication via Keyring along with my service account, as explained in the documentation.

I can successfully upload a package using Twine, and I can successfully install it into a local Python project with the following requirements.txt:

--extra-index-url https://my-region-pypi.pkg.dev/my-project/my-python-repo/simple/
my-package
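
For reference, the upload and the local install went roughly like this (a sketch: the repository URL mirrors the index URL above, and dist/* assumes the package is already built):

pip install twine keyring keyrings.google-artifactregistry-auth
python -m twine upload --repository-url https://my-region-pypi.pkg.dev/my-project/my-python-repo/ dist/*
pip install -r requirements.txt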

However, when I deploy a minimal Cloud Function to the same project as my Artifact Registry, with the same requirements.txt shown above, the deployment fails with the following output:

ERROR: (gcloud.functions.deploy) OperationError: code=3, message=Build failed: `pip_download_wheels` had stderr output:
ERROR: Could not find a version that satisfies the requirement my-package (from -r requirements.txt (line 2)) (from versions: none)
ERROR: No matching distribution found for my-package (from -r requirements.txt (line 2))
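
For anyone wanting to reproduce this locally: the failing step can presumably be approximated with pip download (an assumption on my part about what pip_download_wheels does under the hood):

pip download -r requirements.txt -d wheels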

I tried with both --extra-index-url and just plain --index-url, with no difference. I also tried installing the keyring dependencies with the following requirements.txt:

--extra-index-url https://my-region-pypi.pkg.dev/my-project/my-python-repo/simple/
keyring
keyrings.google-artifactregistry-auth
my-package

But I get the same error.

I checked the permissions for my default App Engine service account for my project, which is also used for Cloud Functions, and I can confirm that it has the Artifact Registry Reader role, so it doesn't seem to be a permissions issue.
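
One way to check, assuming the default App Engine SA is named my-project@appspot.gserviceaccount.com:

gcloud projects get-iam-policy my-project --flatten="bindings[].members" --format="table(bindings.role)" --filter="bindings.members:my-project@appspot.gserviceaccount.com"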

I also tried deploying a minimal App Engine service instead of a Cloud Function, but I get the same error.

Many thanks for the help.

Reptant answered 11/5, 2021 at 16:02 Comment(8)
No luck, unfortunately. In the meantime we stopped using Artifact Registry and started including our compressed packages manually in our repos. It works, but it can be a little tedious.Reptant
Has there been any update on this? I have a similar problem where I used AR to install a package (visible in the Cloud Build logs), but it's not available in the environment (help('modules') doesn't list my package either).Romalda
Same issue. But I won't go for the dirty compressed package solution, my workaround will be not using Google Cloud I guess.Jupiter
Has anyone tried with the cloud functions v2 ?Phenazine
Tried on the --gen2, same problemPhenazine
I narrowed down the problem; I think this is a package versioning issue in Artifact Registry. I created a separate issue: #72156498Phenazine
I'm curious to know what happens if you use an SA JSON file to do the auth. If you run this command: gcloud artifacts print-settings python --json-key="somekey.json" --repository=REPO --location=LOCATION, pointing somekey.json to a service account JSON file with the right permissions for the repo, it will generate an extra-index-url with the auth embedded. Do not put this in a public place, as it will contain the JSON SA creds.Manis
just as an FYI, I tried this on my test-project, and I get a 403 error when it tries to download the package, but the logs still show that it did find the package as I can see the exact URL for the .whl file, including version, but the 403 is thrown on the download. Note that the SA I used lives in the same project as the repo, with Artifact Registry Reader permissions.Manis

Took me a while, but I managed to get a CF from one project to download a package from another project.

There are a couple of steps involved, one of which is, as of now, undocumented. Some testing and a look at the logs helped me narrow down the actual behavior.

1: Have a package in one project. I'll call that project repo-project.
Note: the package I uploaded is a simple one that just returns 42 when its only function is called. Any hello-world package should suffice.

2: Have another project for the Cloud Function. I'll call that project cf-project.

3: Create a service account in either project, and give it the Artifact Registry Reader role in repo-project. I'll call this artifact-sa.

4: This is the undocumented step: give the Cloud Build service account from cf-project the same Artifact Registry Reader role in repo-project. The format for the name of this account is <PROJECT-NUMBER>@cloudbuild.gserviceaccount.com.
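
A sketch of the grant with gcloud, where <PROJECT-NUMBER> is the project number of cf-project and repo-project stands for the repo project's ID:

gcloud projects add-iam-policy-binding repo-project --member="serviceAccount:<PROJECT-NUMBER>@cloudbuild.gserviceaccount.com" --role="roles/artifactregistry.reader"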

5: Not sure if this one is needed, but it's how I did it. I ran the command below, pointing to a downloaded JSON key for artifact-sa:
gcloud artifacts print-settings python --json-key="artifact-sa.json" --repository=REPO --location=LOCATION
This prints out a value for --extra-index-url to put in a requirements.txt, with the JSON key embedded. I think using the keyring method mentioned by the OP would also work here.
(Note: I did some extra steps, which aren't strictly needed, to ensure the key doesn't get uploaded to any GitHub repo attached to the CF's code; see "Extra steps for security" below.)

6: Deploy the code however you like.


Summary

So, to summarize: the first authentication to the repo is done with whatever SA you use (e.g., in the keyring, or using the method I described above). Stupidly enough, the download itself is done with the built-in Cloud Build SA of the project you are deploying the Cloud Function to (cf-project). IMHO this should be done by the same SA as the first request.

As to how I found out the Cloud Build SA was the issue: when I had only added artifact-sa to repo-project, the deploy of the CF in cf-project did find the exact .whl file, with the correct version number (per the error in the logs). It tried to download the package, but got a 403 on that download.
I've had other scenarios where the internal usage of SAs on Google's side was a bit wonky, and the Cloud Build one is definitely a repeat offender here.


Extra steps for security

I created a secondary requirements file and added it to my .gitignore to make sure it doesn't get uploaded to my repo, because uploading keys is a bad idea.

requirements.txt

-r privatereq.txt
mypythonpackage

privatereq.txt

--extra-index-url https://_json_key_base64:[BASE64_KEY_SNIPPED]@[LOCATION]-python.pkg.dev/[REPO-PROJECT]/python-repo/simple/
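
If you'd rather build that line by hand instead of via print-settings, the _json_key_base64 part should just be the base64-encoded SA key file (my assumption, based on the credential name; -w0 disables line wrapping and is GNU base64 specific):

base64 -w0 artifact-sa.json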

.gitignore

*/privatereq.txt
Manis answered 9/5, 2022 at 19:33 Comment(3)
Step #4 is what solved the issue for me.Trickery
Step 4 worked for me too, after 2.5 hours of lost time. Where does it come from, and why isn't it documented? Thanks!Secular
An oversight from Google; they only tested the happy path. The real world is sadly not only happy paths.Manis

Pointing the requirements to:

--extra-index-url https://<location>-python.pkg.dev/<project>/<repository>/simple
<package>

and imports in main:

from <package> import <module>

works for me. Remember to repeat your package's required modules in the requirements.txt (the setup.cfg from packaging applies only during the build process).
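
For example, if <package> itself depends on requests (a hypothetical dependency, purely for illustration), the function's requirements.txt would spell that out:

--extra-index-url https://<location>-python.pkg.dev/<project>/<repository>/simple
<package>
requests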

Zaid answered 4/5, 2022 at 15:20 Comment(1)
Strange, is your package in the same project as the Cloud Function? Can you show the details of the package?Phenazine

Step 4 above was definitely the missing piece. I thought I had to give the default service account permission on the Artifact Registry, but granting it to the Cloud Build service account instead got me there. I didn't have to do anything other than that.

Pelaga answered 23/2, 2023 at 14:23 Comment(0)
