How can a Google Cloud Python function access a private Python package
  1. I am using Google Cloud Functions with Python. Several other functions are already in production.
  2. For this function, I have additionally created a custom Python package, which is available on GitHub as a private repo.
  3. I need to install that package in the Google Cloud Function.

WHAT I HAVE DONE

  1. I run the Google Cloud Function locally using functions-framework.
  2. I have a requirements.txt that links to the package, by adding the following line:
    -e git+https://github.com/<repo>/<project>#egg=<project>
  3. I run pip install -r requirements.txt, and the package is installed successfully.
  4. Now import <pkg-name> works in the function's Python code and I am able to access all its functions.

CHALLENGES WHEN PUSHING THE FUNCTION TO THE CLOUD

  1. As per the documentation, to push the Cloud function to Google Cloud, I issue the command:
gcloud functions \
  deploy <function-name> \
  --trigger-http  \
  --entry-point <entry-function> \
  --runtime python37 \
  --project=<my-project>

As expected, this fails because the build has no access to the private repo on GitHub.

  2. I created a Google Cloud Source Repository and linked it to the GitHub repo, hoping I could somehow reference it in requirements.txt. I just do not know how.

  3. I tried setting environment variables for the username and password (not a good idea, I agree) in the Google Cloud Function and referencing them in requirements.txt as:

    -e git+https://${AUTH_USER}:${AUTH_PASSWORD}@github.com/<repo>/<project>#egg=<project>

That too gave an error.

Any help or direction will be greatly appreciated.

Bar answered 26/3, 2020 at 13:44 Comment(0)

You cannot access a private repo from a Cloud Function. According to the official documentation:

" Using private dependencies

Dependencies are installed in a Cloud Build environment that does not provide access to SSH keys. Packages hosted in repositories that require SSH-based authentication must be vendored and uploaded alongside your project's code, as described in the previous section.

You can use the pip install command with the -t DIRECTORY flag to copy private dependencies into a local directory before deploying your app, as follows:

Copy your dependency into a local directory:

pip install -t DIRECTORY DEPENDENCY

Add an empty init.py file to the DIRECTORY directory to turn it into a module.

Import from this module to use your dependency:

import DIRECTORY.DEPENDENCY

"

Specifying dependencies in Python

Pembroke answered 26/3, 2020 at 13:59 Comment(3)
Thanks @marian.vladoi, I did not know that. That is unfortunate; let me check the documentation again. (Bar)
After doing the pip install -t <DIRECTORY> <DEPENDENCY>, the import fails because my <DEPENDENCY> has dependencies of its own. They are loaded into that directory, but the subsequent import <DIRECTORY>.<DEPENDENCY> cannot resolve them, even though they sit in the directory alongside my primary dependency. Does GCF automatically add this directory to its path? Should I be setting my path somewhere? (Doornail)
However, if you use Artifact Registry to host your private Python packages, you can. The documentation linked in the answer above shows how. (Elstan)
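One common fix for the transitive-import problem described in the comment above is to put the vendored directory itself on sys.path rather than importing it package-style. A minimal sketch, using a stand-in module ("vendor" and "mydep" are placeholder names, not real packages):

```shell
# Reproduce the vendored layout with a stand-in dependency:
mkdir -p vendor
cat > vendor/mydep.py <<'EOF'
VALUE = 42
EOF

# Putting the vendor directory on sys.path lets "import mydep" (and any
# absolute imports mydep makes internally) resolve, which the
# package-style "import vendor.mydep" does not:
python3 - <<'EOF'
import os
import sys

sys.path.insert(0, os.path.abspath("vendor"))
import mydep
print(mydep.VALUE)  # prints 42
EOF
```

With this approach, the vendored dependency and its own dependencies are all importable under the top-level names they expect.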

While @marian.vladoi is correct in the comment above that a Google Cloud Function cannot access a private git repo, I have implemented a workaround and am sharing it in case it is suitable. Here is what was done:

  1. The Python package was prepared as per the packaging tutorial (https://packaging.python.org/tutorials/packaging-projects/).
  2. A binary distribution ("whl") file was created using python3 setup.py bdist_wheel.
  3. The whl file was placed in the Cloud Function's folder; I chose to put it in a dist subfolder.
  4. In requirements.txt I added the line ./dist/xxxx.whl alongside the other dependencies required by the function.
  5. The function was pushed to GCP via gcloud deploy ...
  6. On a successful deploy, the package and its dependencies are installed automatically in the function's virtual environment.
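The build-and-vendor steps above can be sketched end to end with a throwaway package (all names here, such as "mypkg" and "my-function", are placeholders, not the actual package; building the wheel assumes setuptools and the wheel package are installed):

```shell
# Step 1: a minimal private package to build (placeholder name "mypkg")
mkdir -p mypkg_src/mypkg
printf '%s\n' 'def hello():' '    return "hello"' > mypkg_src/mypkg/__init__.py
cat > mypkg_src/setup.py <<'EOF'
from setuptools import setup, find_packages

setup(name="mypkg", version="0.1.0", packages=find_packages())
EOF

# Step 2: build the wheel
(cd mypkg_src && python3 setup.py -q bdist_wheel)

# Steps 3-4: copy the wheel into the function's dist/ folder and
# reference it from requirements.txt as a local path
mkdir -p my-function/dist
cp mypkg_src/dist/mypkg-0.1.0-py3-none-any.whl my-function/dist/
echo './dist/mypkg-0.1.0-py3-none-any.whl' >> my-function/requirements.txt
```

Because the wheel is shipped inside the function's source folder, the deploy-time pip install resolves it as a local file and never needs access to the private repo.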

This does not answer how to use a private repo with Google Cloud Functions (which is not possible), but the steps above make it easy to share and update a private package across different cloud functions.

Bar answered 28/3, 2020 at 4:31 Comment(0)

Your third method (the git install with username and password) does work. But remember to add the credentials as build environment variables rather than runtime environment variables, since pip install happens during the build. If you have concerns about using your personal credentials, you can obtain a deployment token instead (which works like personal credentials, with its own username and password).
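With the gcloud CLI this looks roughly like the following (function, project, and credential names are placeholders); the --set-build-env-vars flag passes the variables to the build step, where pip expands them in requirements.txt:

```shell
# Build-time env vars are visible to pip during deployment; runtime env
# vars (--set-env-vars) are not yet set when requirements.txt is processed.
gcloud functions deploy <function-name> \
  --trigger-http \
  --entry-point <entry-function> \
  --runtime python37 \
  --project=<my-project> \
  --set-build-env-vars AUTH_USER=<user>,AUTH_PASSWORD=<token>
```

The requirements.txt line from the question, -e git+https://${AUTH_USER}:${AUTH_PASSWORD}@github.com/<repo>/<project>#egg=<project>, then resolves during the build.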

Nomography answered 15/2, 2022 at 8:25 Comment(1)
THIS. I had been trying all day to use Secrets to hide the password for a private PyPI, but it wasn't working. We finally figured out that the env var just wasn't available yet: it turns out Secrets are only available at run time, so for env vars needed in requirements.txt we have to use the slightly-less-secret, but available-earlier, build variables. (Frias)
