Can you run Celery in a different container from Django?
From my reading today, in all the examples I found I didn't see any where Celery runs in a completely separate container from Django itself. It seems as though Celery has to be in the same container, since it walks the app's source files looking for tasks.py as well as the initial celery.py. Is that correct, or did I misread?
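
For reference, the autodiscovery I'm describing is the standard celery.py layout from the Celery docs ("proj" is a placeholder project name):

import os
from celery import Celery

# Make Django settings importable before the Celery app is configured.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'proj.settings')

app = Celery('proj')
# Read any CELERY_* settings from Django's settings module.
app.config_from_object('django.conf:settings', namespace='CELERY')
# This is the call that walks INSTALLED_APPS looking for tasks.py modules.
app.autodiscover_tasks()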

For example, I am familiar with using docker-compose to spin up Django, Nginx, Postgres and a storage container. I assumed I'd be adding a Celery and a RabbitMQ container, but I see no way to configure Django to use a remote Celery server.

I'm still early in my understanding of Celery; I hope this isn't something I overlooked elsewhere.

Thanks,
-p

Alack answered 5/4, 2016 at 23:47 Comment(4)
There are several examples on the web if you search for docker-compose django celery: syncano.io/blog/… but maybe I am not understanding the question. – Bellyful
I've read that article. It is quite old (it refers to fig and South) and it does not speak to whether Celery has to run in the same container. But thank you for the reply. – Alack
It's basically the same: you do have to run the same code, so yes, you use the same image, but then use the command and environment statements in your docker-compose file to run one container as regular Django and one as Celery. – Bellyful
You know, I had never really thought about it that way before. You can install multiple services into an image but run them in separate containers. Of course that seems obvious now. ;-) I'll give that a shot and follow up on the answer later. – Alack
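
To illustrate the suggestion from the comments, here is a minimal docker-compose sketch: one image built from the project, run as two containers with different commands. The project name myproject is hypothetical, and the worker command uses the Celery 3-era syntax mentioned later in this thread.

version: '2'
services:
    rabbitmq:
        image: rabbitmq:3-alpine
    web:
        build: .
        command: gunicorn myproject.wsgi --bind 0.0.0.0:8000
        links:
            - rabbitmq
    worker:
        build: .                          # same image as web
        command: celery worker -A myproject
        links:
            - rabbitmq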
By default, that's what happens if you use Heroku: it runs a web Dyno for Django to respond to requests and another worker Dyno for Celery, each Dyno running on a separate instance.

Both Dynos run the same code, since your Celery worker needs to access the models and it's easy to manage/deploy one code base. But there is nothing stopping you from using a different code base for each instance, as Django and Celery communicate only through a message broker (e.g. RabbitMQ via the AMQP protocol, or Redis).
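
Concretely, pointing Django at a broker running in another container is just a settings change. A sketch, assuming Celery 4+ with the namespace='CELERY' convention and hypothetical compose service names rabbitmq and redis:

# settings.py (sketch -- the host names are docker-compose service names)
CELERY_BROKER_URL = 'amqp://guest:guest@rabbitmq:5672//'  # task messages out
CELERY_RESULT_BACKEND = 'redis://redis:6379/0'            # results back in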

Deafening answered 7/4, 2016 at 0:47 Comment(2)
How is the Django codebase provided to each Celery container? Does the Django code need to be stored inside each Celery container? – Willis
@HelgeSchneider Normally you deploy the exact same app; the only difference is the start command, e.g. gunicorn app.wsgi for Django and celery worker -A app for Celery. If you run a Django command in the Celery container it should work, as both environments are similar. – Deafening
It is simpler to run all processes in a single container. However, you can run each process in a separate container; they will communicate with each other through the broker and the results backend.
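
A short sketch of why this works: calling a task never imports code across containers, it only serializes a message onto the broker (the task and its arguments here are placeholders):

from celery import shared_task

@shared_task
def add(x, y):
    return x + y

# From Django code in any web container:
result = add.delay(2, 3)       # message goes to the broker
print(result.get(timeout=10))  # result comes back via the results backend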

Here is my example repository using Django, Celery, RabbitMQ and Redis. Additionally, Django serves a REST API (a WSGI server for DRF) and WebSockets (an ASGI server for Django Channels). Every process runs in a separate container, but they share a common codebase (all stored in a single repository).

version: '2'

services:
    nginx:
        restart: always
        image: nginx:1.12-alpine
        ports:
            - 8000:8000
        volumes:
            - ./docker/nginx/default.conf:/etc/nginx/conf.d/default.conf
            - static_volume:/app/backend/server/django_static
    client:
        build:
            context: .
            dockerfile: ./docker/client/Dockerfile
        restart: always
        ports:
            - 5001:5000
        expose:
            - 5000
    postgres:
        restart: always
        image: postgres:9.5.6-alpine
        volumes:
            - ./docker/postgres/data:/var/lib/postgresql
            - ./docker/postgres/init-user-db.sh:/docker-entrypoint-initdb.d/init-user-db.sh
        ports:
            - 5433:5432
        expose:
            - 5432
        environment:
            FILLA_DB_USER: tasks_user
            FILLA_DB_PASSWORD: tasks_password
            FILLA_DB_DATABASE: simple_tasks
            POSTGRES_USER: postgres
    redis:
        image: redis:3.0-alpine
        restart: unless-stopped
        ports:
            - 6378:6379
    rabbitmq:
        image: rabbitmq:3.7-alpine
        restart: unless-stopped
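    # wsgiserver, asgiserver, worker and the listeners below all extend the
    # same backend service from docker-common.yml (one image, one codebase);
    # only the entrypoint differs per container.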
    wsgiserver:
        extends:
            file: docker-common.yml
            service: backend
        entrypoint: /app/docker/backend/wsgi-entrypoint.sh
        volumes:
            - static_volume:/app/backend/server/django_static
        links:
            - postgres
            - redis
            - rabbitmq
        expose:
            - 8000
    asgiserver:
        extends:
            file: docker-common.yml
            service: backend
        entrypoint: /app/docker/backend/asgi-entrypoint.sh
        links:
            - postgres
            - redis
            - rabbitmq
        expose:
            - 9000
    worker:
        extends:
            file: docker-common.yml
            service: backend
        entrypoint: /app/docker/backend/worker-entrypoint.sh
        links:
            - postgres
            - redis
            - rabbitmq
    redislistener:
        extends:
            file: docker-common.yml
            service: backend
        entrypoint: /app/docker/backend/redis-listener-entrypoint.sh
        links:
            - postgres
            - redis
            - rabbitmq
    workerlistener:
        extends:
            file: docker-common.yml
            service: backend
        entrypoint: /app/docker/backend/worker-listener-entrypoint.sh
        links:
            - postgres
            - redis
            - rabbitmq
volumes:
    static_volume: {}
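
The docker-common.yml referenced by the extends blocks is not shown in the excerpt above. Its exact contents are an assumption here; only the file name and the backend service name appear above, so a rough sketch might be:

# docker-common.yml (sketch -- contents assumed)
version: '2'
services:
    backend:
        build:
            context: .
            dockerfile: ./docker/backend/Dockerfile
        restart: always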

For small projects I would go with keeping all processes in the single container and for larger projects that need scaling I would go with multiple containers.

Nitrile answered 30/9, 2022 at 13:24 Comment(0)
