How do I deploy RQ / Celery workers on GCP?
I've got a Flask app that's deployed (using a Dockerfile) to Google Cloud Run. The app's structure closely resembles the Flask Mega Tutorial. It uses a Postgres database that runs on Cloud SQL.

The app needs to process background tasks. It seems like Celery or Redis Queue are the most common ways to go. I don't want to use Cloud Tasks because it breaks the dev/prod parity rule in the 12-factor app paradigm.

Redis Queue was simple to get up and running on my local machine, but I can't find a best-practices guide anywhere on how to use Redis Queue with a Flask app running on Cloud Run.
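For context, a minimal local RQ setup looks something like the sketch below. The task name and payload are illustrative; running the enqueue part assumes a local Redis server on the default port, with `rq worker` started in a separate terminal:

```python
# Minimal local RQ sketch. The task function is plain Python so RQ can
# pickle a reference to it and a worker process can import and run it.

def count_words(text):
    """Hypothetical background task: count words in a string."""
    return len(text.split())

if __name__ == "__main__":
    # Assumes redis-server is running on localhost:6379.
    from redis import Redis
    from rq import Queue

    q = Queue(connection=Redis())
    job = q.enqueue(count_words, "hello background world")
    print(job.id)  # process it by running `rq worker` in another shell
```

The worker imports the same module, which is why the app structure (task functions importable by both the web process and the worker) matters for deployment.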

I decided to use Google's Memorystore for my Redis instance, but now I'm not sure what the best way to run my Redis workers is. I'd like for these workers to scale up as more tasks are added to the Redis Queue by my Flask server (the way Cloud Run scales up instances when more and more HTTP requests are made). Right now, I'm considering deploying a worker (a copy of my Flask app with the task functions) to App Engine, but that doesn't seem like quite the right solution.

What do people recommend for deploying RQ / Celery workers? I'm happy to alter my deployment strategy (and platform) entirely to achieve a simple, scalable architecture that can be easily reproduced in a local dev setup.

Sleazy answered 10/5, 2021 at 0:33 Comment(2)
Did you manage to come to a good solution for this problem? – Scapegrace
You can use github.com/aertje/cloud-tasks-emulator to keep dev/prod env in sync – Scutari
I have achieved this with Redis Queue and a Flask app on GCP. To do so, follow these steps:

  1. Update the Dockerfile so that a single container runs two commands: one for the Flask backend and one for the worker process (e.g. `rq worker` or `python worker.py`). Place both commands in a run.sh file, make it executable with a chmod instruction in the Dockerfile, and set the CMD line to `CMD ["./run.sh"]`.
  2. Set CPU allocation to "CPU is always allocated" in the Cloud Run service settings (via the service's edit/deploy options). By default, CPU is throttled as soon as the backend (Flask) returns a response, so background Redis Queue jobs do not run reliably, especially when they need network access or a database connection.
  3. If a package the worker needs cannot be installed via requirements.txt, you can instead install it in the Dockerfile with `apt-get install your-worker-package`.
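A sketch of step 1 (with the apt-get fallback from step 3 shown as Dockerfile comments), assuming the Flask app is served by gunicorn as `app:app` and the RQ worker listens on the default queue; file names, the package name, and the module path are illustrative:

```shell
#!/bin/sh
# run.sh -- start the RQ worker in the background, then run the Flask
# app in the foreground so the container stays alive.
#
# Corresponding Dockerfile lines (sketch):
#   RUN apt-get update && apt-get install -y your-worker-package  # step 3 fallback
#   COPY run.sh .
#   RUN chmod +x run.sh
#   CMD ["./run.sh"]

rq worker default &                    # background task worker
exec gunicorn --bind :$PORT app:app    # Flask app; Cloud Run injects $PORT
```

Step 2 can also be applied from the CLI with `gcloud run services update SERVICE --no-cpu-throttling`, which is equivalent to choosing "CPU is always allocated" in the console.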
Theocracy answered 21/9, 2023 at 22:7 Comment(0)
