Docker Compose for production and development

So I use Python+Django (but it does not really matter for this question)

When I write my code I simply run

./manage.py runserver 

which runs the development web server, serves static files, auto-reloads on changes, etc.

and to deploy it to production I use a series of commands like

./manage.py collectstatic
./manage.py migrate
uwsgi --http 127.0.0.1:8000 -w wsgi --processes=4

I also have a few other services like Postgres and Redis (which are common to both production and dev).

So I'm trying to adopt Docker (+ Compose) here, and I cannot figure out how to split prod and dev with it.

Basically, in docker-compose.yml you define your services and images, but in my case the image in production should run one CMD and in dev another.

What are the best practices to achieve that?

Sukkoth asked 31/10, 2017 at 15:53

You should create additional Compose files like docker-compose-dev.yml or docker-compose-pro.yml and override parts of the original docker-compose.yml configuration with the -f option:

docker-compose -f docker-compose.yml -f docker-compose-dev.yml up -d
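
For example, the base docker-compose.yml can hold everything the environments share, while the dev file only overrides what differs. A minimal sketch, assuming the Django service is called web (service names and paths are illustrative):

# docker-compose.yml -- shared by all environments
version: "3"
services:
  web:
    build: .
    ports:
      - "8000:8000"
  db:
    image: postgres
  redis:
    image: redis

# docker-compose-dev.yml -- only what differs in development
version: "3"
services:
  web:
    command: ./manage.py runserver 0.0.0.0:8000
    volumes:
      - .:/app

Passing both files with -f makes Compose merge them in order, so the dev command and volumes win over the base definition.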

Sometimes I also use a different Dockerfile for each environment and specify the dockerfile parameter in the docker-compose-pro.yml build section, but I don't recommend it because you will end up with duplicated Dockerfiles.

Update

Docker has introduced the multi-stage builds feature (https://docs.docker.com/develop/develop-images/multistage-build/#use-multi-stage-builds), which lets you cover different environments with a single Dockerfile.
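
For illustration, a minimal multi-stage sketch, assuming a standard Django layout (manage.py, requirements.txt and wsgi.py at the project root) and that uwsgi is listed in requirements.txt; image versions and names are just placeholders:

# Dockerfile -- one file, one stage per environment
FROM python:3.11-slim AS base
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .

# dev target: use Django's development server
FROM base AS dev
CMD ["python", "manage.py", "runserver", "0.0.0.0:8000"]

# prod target: bake static files in and run uwsgi
FROM base AS prod
RUN python manage.py collectstatic --noinput
CMD ["uwsgi", "--http", "0.0.0.0:8000", "-w", "wsgi", "--processes=4"]

You then pick a stage at build time, e.g. docker build --target dev -t myapp:dev . for development, or point the target option of the Compose build section at the right stage.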

Justin answered 31/10, 2017 at 16:3

Docker Compose can build out a production environment or development environment depending on what the .env file says.

The build: directive for each service in your compose.yaml file allows you to specify the Dockerfile used to build the service, for example dockerfile: ${nginx_Prod_Dockerfile}. You will need two Dockerfiles for each service: one for dev, the other for prod.

This allows you to utilize the same compose.yaml to build both environments depending on what your .env file says.

See the documentation on the CLI, "Substitute with --env-file":

You can set default values for multiple environment variables in an environment file and then pass the file as an argument in the CLI.

The advantage of this method is that you can store the file anywhere and name it appropriately, for example, .env.ci, .env.dev, .env.prod. This file path is relative to the current working directory where the Docker Compose command is executed. Passing the file path is done using the --env-file option:

docker compose --env-file ./config/.env.dev up
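
To illustrate, a minimal sketch of the moving parts (the variable name and Dockerfile names are placeholders, not something prescribed by Compose):

# compose.yaml -- the Dockerfile path comes from the environment
services:
  nginx:
    build:
      context: .
      dockerfile: ${NGINX_DOCKERFILE}

# .env.dev
NGINX_DOCKERFILE=Dockerfile.nginx.dev

# .env.prod
NGINX_DOCKERFILE=Dockerfile.nginx.prod

Running docker compose --env-file .env.prod up --build then builds from the production Dockerfile, while --env-file .env.dev builds from the development one.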

Beamy answered 11/5, 2023 at 19:17

Usually, having different startup workflows for production and dev is a bad idea. You should try to keep the dev and prod environments very similar, even in the way you launch your applications, and externalize any configuration that differs between environments.

Having a different startup sequence may be acceptable; however, having multiple Docker images (or Dockerfiles) per environment is a very bad idea. Docker images should be immutable and portable.

However, you might have some constraints. Docker Compose lets you override the command specified in the image: the command property replaces the image's default CMD. I would recommend keeping the image production-ready, i.e. use something like CMD ./manage.py collectstatic && ./manage.py migrate && uwsgi --http 127.0.0.1:8000 -w wsgi --processes=4 in the Dockerfile.

In the compose file just override the CMD by specifying:

command: ./manage.py runserver 

Having multiple compose files is usually not a big issue. You can keep them clean and manageable by using compose file features such as extends, where one compose file can extend another.
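
For instance, a shared service definition can be reused per environment with extends (exact support depends on your Compose file format and version; file and service names below are illustrative):

# common-services.yml -- the shared, production-ready definition
services:
  app:
    build: .
    environment:
      - DJANGO_SETTINGS_MODULE=mysite.settings

# docker-compose.dev.yml -- reuses the shared definition and swaps the command
services:
  web:
    extends:
      file: common-services.yml
      service: app
    command: ./manage.py runserver 0.0.0.0:8000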

Studley answered 31/10, 2017 at 17:6
