- I am using Google Cloud Functions with Python. Several other functions are already in production.
- For this function, however, I have additionally created a custom Python package that is available on GitHub as a private repo.
- I need to install that package in the Cloud Function.
WHAT I HAVE DONE
- I run the function locally using `functions-framework`.
- I have a `requirements.txt` that links to the package, by adding the following line to `requirements.txt`:

  -e git+https://github.com/<repo>/<project>#egg=<project>

- I run `pip install -r requirements.txt`, and the package is installed successfully.
- In the function's Python code, `import <pkg-name>` works and I am able to access all of the package's functions.
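For context, the local setup can be sketched as the kind of minimal HTTP entry point that `functions-framework` serves. The names here are illustrative, not from my actual code: `handler` stands in for the entry function, and the private package import is shown only as a comment.

```python
# Minimal sketch of an HTTP Cloud Function entry point, as served locally by:
#   functions-framework --target handler
# The private package would be imported at the top, e.g.:
#   import <pkg-name>   # works locally once `pip install -r requirements.txt` has run

def handler(request):
    """HTTP entry point; `request` is a flask.Request both locally and on GCP."""
    # Guard against requests that carry no query args at all
    name = request.args.get("name", "world") if request.args else "world"
    return f"Hello, {name}!"
```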
CHALLENGES WHEN PUSHING THE FUNCTION TO THE CLOUD
- As per the documentation, to deploy the function to Google Cloud, I issue the command:

  gcloud functions deploy <function-name> \
    --trigger-http \
    --entry-point <entry-function> \
    --runtime python37 \
    --project=<my-project>
As expected, this gives an error because the build environment does not have access to the private repo on GitHub.
I created a Google Cloud Source Repository and linked it to the GitHub repo, hoping that I could somehow reference it in `requirements.txt`. I just do not know how.
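One option I am wondering about (sketched below; the commands are my assumption, not something I have confirmed works) is to vendor the package into the function's source directory before deploying, so that the build step never needs to reach the private repo at all:

```shell
# Hypothetical workaround sketch: install the private package into the
# function's deploy directory so it is uploaded with the source, and the
# cloud build never has to authenticate against the private GitHub repo.
# <repo>/<project> etc. are the same placeholders as above.
pip install --target=. git+https://github.com/<repo>/<project>#egg=<project>

# Then deploy as usual; the vendored package ships with the uploaded source.
gcloud functions deploy <function-name> \
  --trigger-http \
  --entry-point <entry-function> \
  --runtime python37 \
  --project=<my-project>
```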
I tried setting environment variables for the username and password (not a good idea, I agree) on the Cloud Function and referencing them in `requirements.txt` as:

  -e git+https://${AUTH_USER}:${AUTH_PASSWORD}@github.com/<repo>/<project>#egg=<project>

That too gave an error.
Any help or direction will be greatly appreciated.