How to manage local vs production settings in Django?
Asked Answered
M

23

349

What is the recommended way of handling settings for local development and the production server? Some of them (like constants, etc.) can be changed/accessed in both, but some of them (like paths to static files) need to remain different, and hence should not be overwritten every time new code is deployed.

Currently, I am adding all constants to settings.py. But every time I change some constant locally, I have to copy it to the production server and edit the file for production-specific changes... :(

Edit: looks like there is no standard answer to this question, I've accepted the most popular method.

Machado answered 26/10, 2009 at 18:0 Comment(5)
See #88759Concubinage
Please have a look at django-configurations.Cattan
The accepted method is no longer the most popular one.Nightjar
django-split-settings is very easy to use. It does not require rewriting any default settings.Cumulostratus
You should use a base.py file, and in your local.py do "from .base import *"; the same in your production.py ("from .base import *"). You then run your project with: python manage.py runserver --settings=project_name.settings.localDorison
A
329

Two Scoops of Django: Best Practices for Django 1.5 suggests using version control for your settings files and storing the files in a separate directory:

project/
    app1/
    app2/
    project/
        __init__.py
        settings/
            __init__.py
            base.py
            local.py
            production.py
    manage.py

The base.py file contains common settings (such as MEDIA_ROOT or ADMINS), while local.py and production.py have site-specific settings:

In the base file settings/base.py:

INSTALLED_APPS = (
    # common apps...
)

In the local development settings file settings/local.py:

from project.settings.base import *

DEBUG = True
INSTALLED_APPS += (
    'debug_toolbar', # and other apps for local development
)

In the production settings file settings/production.py:

from project.settings.base import *

DEBUG = False
INSTALLED_APPS += (
    # other apps for production site
)

Then when you run django, you add the --settings option:

# Running django for local development
$ ./manage.py runserver 0:8000 --settings=project.settings.local

# Running django shell on the production site
$ ./manage.py shell --settings=project.settings.production

The authors of the book have also put up a sample project layout template on Github.
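
As the comments below note, instead of passing --settings every time you can point the DJANGO_SETTINGS_MODULE environment variable at the module you want. A minimal sketch of how wsgi.py might pick the production module under the layout above (an illustration, not from the book):

# wsgi.py -- a sketch assuming the project.settings.production layout above
import os

from django.core.wsgi import get_wsgi_application

# setdefault() only applies when DJANGO_SETTINGS_MODULE is not already set,
# so exporting the variable elsewhere still wins over this default.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "project.settings.production")

application = get_wsgi_application()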

Arose answered 10/3, 2013 at 18:34 Comment(15)
Note that instead of using --settings every time, you could set the DJANGO_SETTINGS_MODULE envvar. This works nicely with, eg, Heroku: set it globally to production, then override it with dev in your .env file.Feverroot
Using DJANGO_SETTINGS_MODULE env var is the best idea here, thanks Simon.Gaberones
@Gaberones wanna edit my answer or should I to add the env-var stuff?Arose
I was getting an ImportError: Could not import settings 'project.settings.local' (Is it on sys.path?): No module named settings.local. Didn't realize you needed __init__.py in directory in order for import to work.Votyak
@Votyak you scared me for a second, I thought I forgot to put the __init__.py file in the directory listing above!Arose
@pydanny comments in January that this is an anti-pattern and then in March the book pydanny co-authored that calls this an anti-pattern is brought up completely separately as evidence of its anti-pattern-ocity. That's funny :D. Great book by the way!Longdrawn
this is good 1) I can test against prd settings in my local 2) I can share variables in prd/dev settings 3)complete base & prd setting under version controlBlueweed
You may need to change BASE_DIR settings to os.path.dirname(os.path.realpath(os.path.dirname(__file__) + "/.."))Shantung
@ohouse, I've see a lot of solutions like this (dev and prod settings are separate files that import from a base settings.py file). However, the rest of my project imports values from settings.py. How do you go about solving that?Righthand
@Righthand according to the django docs, you import from django.conf import settings which is an object that abstracts the interface and decouples the code from the location of the settings, docs.djangoproject.com/en/dev/topics/settings/…Arose
@PetrPeller - this is important - I'm struggling with my code no longer finding my modules after implementing this! (But only in Waitress, which is weird - runserver works fine.)Geisel
@SimonWeber Good idea. Also, on your dev machine, if you put alias django-settings="path=\$(pwd);dir=\$(basename \$path); export DJANGO_SETTINGS_MODULE=\$dir\".settings.dev\"" in your .bashrc (or similar), you can just type django-settings in the root dir of any django project that uses this pattern. @rsp, this will help you too I guess, just don't type django-settings for those other projects.Translative
If I set the DJANGO_SETTINGS_MODULE through an environmental variable, do I still need os.environ.setdefault("DJANGO_SETTINGS_MODULE", "projectname.settings.production") in my wsgi.py file? Also, I've set the environmental var using: export DJANGO_SETTINGS_MODULE=projectname.settings.local, but then it is lost when I close the terminal. What can I do to ensure it is saved? Should I add that line to the bashrc file?Medallion
@PetrPeller this is cleaner: BASE_DIR = os.path.dirname(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))Proconsul
In Django 4.1, which I don't think the version matters, I was able to accomplish a cleaner BASE_DIR with BASE_DIR = Path(__file__).resolve().parent.parent.parent. This requires importing from pathlib import Path.Barraza
S
136

In settings.py:

try:
    from local_settings import *
except ImportError as e:
    pass

You can override whatever is needed in local_settings.py; it should then stay out of your version control. But since you mention copying, I'm guessing you use none ;)
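
For illustration, a local_settings.py on a development machine might look like this (a sketch with placeholder values, not part of the original answer):

# local_settings.py -- not under version control
DEBUG = True

# Use a local database instead of the production one
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': 'dev.sqlite3',
    }
}

# Local path for collected static files
STATIC_ROOT = '/home/me/myproject/static'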

Sedum answered 27/10, 2009 at 10:1 Comment(15)
To ease tracking/deployment of new settings, use a "local_settings.py" on the production/testing machines and none on development.Atomic
That's the way I do - adding those lines at the end of settings.py so they can override the default settingsStandardize
Cleanest way, especially if you're using version control.Hartman
This approach means you have unversioned code running in development and production. And every developer has a different code base. I call anti-pattern here.Glarus
@Glarus The problem is that Django stores its configuration in a .py file. You can't expect that all developers and the production server will use the same settings, so you need to alter this .py file or implement some alternative solution (.ini files, environment etc.).Comestible
I prefer calling the module settings_local as opposed to local_settings to group it with settings.py in alphabetical folder listings. Keep settings_local.py out of version control using .gitignore as credentials don't belong to Git. Imagine open sourcing them by accident. I keep in git a template file called settings_local.py.txt instead.Guidotti
@Comestible can't you expect that? In the age of virtualization and containers I think you can and should strive for that. Maybe not always possible, but you can selectively address individual issues by defining environment variables to allow overrides as necessary. Sure beats the "Works for Me"-shrug when the junior asks the senior why something isn't working.Deangelis
Hi, can I ask why except ImportError as e:? Isn't except ImportError: just enough?Corfu
How does Django know what is local and what is production? Is there a default that states that whatever is in local_settings.py belongs to the local environment? Also, will having this file in production have an effect on performance and cause unintended behavior?Chromatolysis
@Chromatolysis local_settings.py is out of your version control, you don't push that file to the production server. This file is available only in your local development environment. If this file gets to the production server, it will certainly cause unintended behaviour.Pulp
Definitely an antipattern, but very popular. Developers should know what settings are used in the production environment. Everything should be versioned except secrets - those should be pulled from environment variables. Deployment tools like Ansible can set them.Henghold
If you want to modify not override variables like lists (e.g. INSTALLED_APPS), you can import them in the local_settings.py like: from .settings import INSTALLED_APPS and then in local_settings.py: INSTALLED_APPS += ['autotranslate']Cestar
@Glarus I'd like to hear what your better solution to this problem is. Mind sharing it?Whoops
BTW, what I'm currently doing is relying on environment variables and having the project.settings module rely on pulling them from os.environ.Whoops
Do settings in local_settings automatically override those in settings.py?Methodize
L
76

Instead of settings.py, use this layout:

.
└── settings/
    ├── __init__.py  <= not versioned
    ├── common.py
    ├── dev.py
    └── prod.py

common.py is where most of your configuration lives.

prod.py imports everything from common, and overrides whatever it needs to override:

from __future__ import absolute_import # optional, but I like it
from .common import *

# Production overrides
DEBUG = False
#...

Similarly, dev.py imports everything from common.py and overrides whatever it needs to override.

Finally, __init__.py is where you decide which settings to load, and it's also where you store secrets (therefore this file should not be versioned):

from __future__ import absolute_import
from .prod import *  # or .dev if you want dev

##### DJANGO SECRETS
SECRET_KEY = '(3gd6shenud@&57...'
DATABASES['default']['PASSWORD'] = 'f9kGH...'

##### OTHER SECRETS
AWS_SECRET_ACCESS_KEY = "h50fH..."

What I like about this solution is:

  1. Everything is in your versioning system, except secrets
  2. Most configuration is in one place: common.py.
  3. Prod-specific things go in prod.py, dev-specific things go in dev.py. It's simple.
  4. You can override stuff from common.py in prod.py or dev.py, and you can override anything in __init__.py.
  5. It's straightforward python. No re-import hacks.
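
If you would rather not keep secrets even in an unversioned file, the same __init__.py can pull them from the environment instead. A sketch under that assumption (the variable names are illustrative):

# settings/__init__.py -- secrets read from the environment instead of hard-coded
import os

from .prod import *  # or .dev if you want dev; DATABASES is expected to come from common.py

SECRET_KEY = os.environ['SECRET_KEY']  # fail loudly if it is not set
DATABASES['default']['PASSWORD'] = os.environ.get('DB_PASSWORD', '')
AWS_SECRET_ACCESS_KEY = os.environ.get('AWS_SECRET_ACCESS_KEY', '')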
Langobard answered 9/3, 2013 at 19:48 Comment(4)
I'm still trying to figure out what to set in my project.wsgi and manage.py files for the settings file. Will you shed some light on this? Specifically, in my manage.py file I have os.environ.setdefault("DJANGO_SETTINGS_MODULE", "foobar.settings") foobar is a folder with an __init__.py file and settings is a folder with an __init__.py file that contains my secrets and imports dev.py, which then imports common.py. EDIT Nevermind, I didn't have a module installed that was required. My bad! This works great!!Convertiplane
Two things: 1) better to set Debug=True in your dev.py rather than =False in your prod.py. 2) Rather than switching in init.py, switch using the DJANGO_SETTINGS_MODULE environment var. This will help with PAAS deployments (e.g. Heroku).Geisel
When I use this setup in django 1.8.4 and try runserver I get "django.core.exceptions.ImproperlyConfigured: The SECRET_KEY setting must not be empty.", even though I have SECRET_KEY in my init.py file. Am I missing something?Shortcake
Isn't the use of something like AWS_SECRET_ACCESS_KEY = os.getenv("AWS_SECRET_ACCESS_KEY") more secure? Honest question - I know why you don't want it versioned, but the other alternative is to get it from the environment. Which begs the question of setting the environment variable, of course, but that can be left to your deployment mechanism, no?Casaleggio
C
20

I use a slightly modified version of the "if DEBUG" style of settings that Harper Shelby posted. Obviously depending on the environment (win/linux/etc.) the code might need to be tweaked a bit.

I was in the past using the "if DEBUG" approach, but I found that occasionally I needed to do testing with DEBUG set to False. What I really wanted was to distinguish whether the environment was production or development, which gave me the freedom to choose the DEBUG level.

import os  # needed for os.environ (usually already imported at the top of settings.py)

PRODUCTION_SERVERS = ['WEBSERVER1', 'WEBSERVER2']

# COMPUTERNAME is a Windows environment variable holding the machine's name
if os.environ['COMPUTERNAME'] in PRODUCTION_SERVERS:
    PRODUCTION = True
else:
    PRODUCTION = False

DEBUG = not PRODUCTION
TEMPLATE_DEBUG = DEBUG

# ...

if PRODUCTION:
    DATABASE_HOST = '192.168.1.1'
else:
    DATABASE_HOST = 'localhost'

I'd still consider this way of handling settings a work in progress. I haven't seen any one way of handling Django settings that covered all the bases and at the same time wasn't a total hassle to set up (I'm not down with the 5x-settings-files methods).
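
COMPUTERNAME is a Windows-only environment variable, so the check above raises a KeyError on Linux or macOS hosts. A more portable sketch of the same idea using platform.node() (an adaptation, not part of the original answer):

import platform

PRODUCTION_SERVERS = ['WEBSERVER1', 'WEBSERVER2']

# platform.node() returns the machine's hostname on Windows, Linux and macOS
PRODUCTION = platform.node().upper() in PRODUCTION_SERVERS

DEBUG = not PRODUCTION
TEMPLATE_DEBUG = DEBUG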

Continuation answered 26/10, 2009 at 18:34 Comment(5)
This is the kind of thing that Django's settings being an actual code file allows, and I was hinting at. I haven't done anything like this myself, but it's definitely the sort of solution that might be a better general answer than mine.Antiquated
I just ran into this for the first time and chose to (successfully!) use your solution, with a slight difference: I used uuid.getnode() to find the uuid of my system. So I'm testing if uuid.getnode() == 12345678901 (actually a different number) instead of the os.environ test you used. I couldn't find documentation to convince me that os.environ['COMPUTERNAME'] is unique per computer.Fraktur
os.environ['COMPUTERNAME'] doesn't work on Amazon AWS Ubuntu. I get a KeyError.Vanillic
When using the UUID this solution has proven to be the best and simplest for me. It doesn't require lots of complicated and over-modularized patchwork. In a production environment, you still need to place your database passwords and SECRET_KEY in a separate file that resides outside of version control.Vanillic
os.environ['COMPUTERNAME'] unfortunately does not work on PythonAnywhere. You get a KeyError.Beeswax
N
16

TL;DR: The trick is to modify os.environ before you import settings/base.py in any settings/<purpose>.py; this will greatly simplify things.


Just thinking about all these intertwining files gives me a headache. Combining, importing (sometimes conditionally), overriding, patching of what was already set in case the DEBUG setting changed later on. What a nightmare!

Through the years I went through all the different solutions. They all somewhat work, but are so painful to manage. WTF! Do we really need all that hassle? We started with just one settings.py file. Now we need documentation just to combine all of these together in the correct order!

I hope I finally hit the (my) sweet spot with the solution below.

Let's recap the goals (some common, some mine)

  1. Keep secrets a secret — don't store them in a repo!

  2. Set/read keys and secrets through environment settings, 12 factor style.

  3. Have sensible fallback defaults. Ideally for local development you don't need anything more besides the defaults.

  4. …but try to keep defaults production safe. It's better to miss a settings override locally than to have to remember to adjust the default settings to be safe for production.

  5. Have the ability to switch DEBUG on/off in a way that can have an effect on other settings (eg. using javascript compressed or not).

  6. Switching between purpose settings, like local/testing/staging/production, should be based only on DJANGO_SETTINGS_MODULE, nothing more.

  7. …but allow further parameterization through environment settings like DATABASE_URL.

  8. …also allow them to use different purpose settings and run them locally side by side, eg. production setup on local developer machine, to access production database or smoke test compressed style sheets.

  9. Fail if an environment variable is not explicitly set (requiring an empty value at minimum), especially in production, eg. EMAIL_HOST_PASSWORD.

  10. Respond to default DJANGO_SETTINGS_MODULE set in manage.py during django-admin startproject

  11. Keep conditionals to a minimum; if the condition is the purposed environment type (e.g. for production, set the log file and its rotation), override settings in the associated purposed settings file.

Do not's

  1. Do not let django read the DJANGO_SETTINGS_MODULE setting from a file.
    Ugh! Think of how meta this is. If you need to have a file (like docker env), read that into the environment before starting up a django process.

  2. Do not override DJANGO_SETTINGS_MODULE in your project/app code, eg. based on hostname or process name.
    If you are too lazy to set the environment variable (like for setup.py test), do it in tooling just before you run your project code.

  3. Avoid magic and patching of how django reads its settings; preprocess the settings but do not interfere afterwards.

  4. No complicated logic-based nonsense. Configuration should be fixed and materialized, not computed on the fly. Providing fallback defaults is just enough logic here.
    Do you really want to debug why locally you have the correct set of settings, but in production on a remote server, on one of a hundred machines, something computed differently? Oh! Unit tests? For settings? Seriously?

Solution

My strategy consists of the excellent django-environ used with ini-style files, providing os.environ defaults for local development, and some minimal and short settings/<purpose>.py files that import settings/base.py AFTER os.environ has been set from an INI file. This effectively gives us a kind of settings injection.

The trick here is to modify os.environ before you import settings/base.py.

To see the full example go to the repo: https://github.com/wooyek/django-settings-strategy

.
│   manage.py
├───data
└───website
    ├───settings
    │   │   __init__.py   <-- imports local for compatibility
    │   │   base.py       <-- almost all the settings, reads from process environment 
    │   │   local.py      <-- a few modifications for local development
    │   │   production.py <-- ideally is empty and everything is in base 
    │   │   testing.py    <-- mimics production with reasonable exceptions
    │   │   .env          <-- for local use, not kept in repo
    │   __init__.py
    │   urls.py
    │   wsgi.py

settings/.env

Defaults for local development. A secret file, used mostly to set required environment variables. Set them to empty values if they are not required in local development. We provide defaults here, and not in settings/base.py, so that any other machine fails if they're missing from the environment.

settings/local.py

What happens here is: we load the environment from settings/.env, then import common settings from settings/base.py. After that we can override a few settings to ease local development.

import logging
import environ

logging.debug("Settings loading: %s" % __file__)

# This will read missing environment variables from a file
# We want to do this before loading the base settings, as they may depend on the environment
environ.Env.read_env(DEBUG='True')

from .base import *

ALLOWED_HOSTS += [
    '127.0.0.1',
    'localhost',
    '.example.com',
    'vagrant',
    ]

# https://docs.djangoproject.com/en/1.6/topics/email/#console-backend
EMAIL_BACKEND = 'django.core.mail.backends.console.EmailBackend'
# EMAIL_BACKEND = 'django.core.mail.backends.dummy.EmailBackend'

LOGGING['handlers']['mail_admins']['email_backend'] = 'django.core.mail.backends.dummy.EmailBackend'

# Sync task testing
# http://docs.celeryproject.org/en/2.5/configuration.html?highlight=celery_always_eager#celery-always-eager

CELERY_ALWAYS_EAGER = True
CELERY_EAGER_PROPAGATES_EXCEPTIONS = True

settings/production.py

For production we should not expect an environment file, but it's easier to have one if we're testing something. But anyway, let's provide a few defaults inline, so settings/base.py can respond accordingly.

import environ
from pathlib import Path

# Resolve production.env next to this settings file, if it exists
environ.Env.read_env(Path(__file__).parent / "production.env", DEBUG='False', ASSETS_DEBUG='False')
from .base import *

The main points of interest here are the DEBUG and ASSETS_DEBUG overrides; they will be applied to os.environ ONLY if they are MISSING from both the environment and the file.

These will be our production defaults, no need to put them in the environment or file, but they can be overridden if needed. Neat!

settings/base.py

These are your mostly vanilla django settings, with a few conditionals and lots of reading from the environment. Almost everything is in here, keeping all the purposed environments consistent and as similar as possible.

The main differences are below (I hope these are self-explanatory):

import environ

# https://github.com/joke2k/django-environ
env = environ.Env()

# Build paths inside the project like this: os.path.join(BASE_DIR, ...)
BASE_DIR = os.path.dirname(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

# Where BASE_DIR is a django source root, ROOT_DIR is a whole project root
# It may differ from BASE_DIR, e.g. when your django project code is in a `src` folder
# This may help to separate python modules and *django apps* from other stuff
# like documentation, fixtures, docker settings
ROOT_DIR = BASE_DIR

# Quick-start development settings - unsuitable for production
# See https://docs.djangoproject.com/en/1.11/howto/deployment/checklist/

# SECURITY WARNING: keep the secret key used in production secret!
SECRET_KEY = env('SECRET_KEY')

# SECURITY WARNING: don't run with debug turned on in production!
DEBUG = env('DEBUG', default=False)

INTERNAL_IPS = [
    '127.0.0.1',
]

ALLOWED_HOSTS = []

if 'ALLOWED_HOSTS' in os.environ:
    hosts = os.environ['ALLOWED_HOSTS'].split(" ")
    BASE_URL = "https://" + hosts[0]
    for host in hosts:
        host = host.strip()
        if host:
            ALLOWED_HOSTS.append(host)

SECURE_SSL_REDIRECT = env.bool('SECURE_SSL_REDIRECT', default=False)

# Database
# https://docs.djangoproject.com/en/1.11/ref/settings/#databases

if "DATABASE_URL" in os.environ:  # pragma: no cover
    # Enable database config through environment
    DATABASES = {
        # Raises ImproperlyConfigured exception if DATABASE_URL not in os.environ
        'default': env.db(),
    }

    # Make sure we have all the settings we need
    # DATABASES['default']['ENGINE'] = 'django.contrib.gis.db.backends.postgis'
    DATABASES['default']['TEST'] = {'NAME': os.environ.get("DATABASE_TEST_NAME", None)}
    DATABASES['default']['OPTIONS'] = {
        'options': '-c search_path=gis,public,pg_catalog',
        'sslmode': 'require',
    }
else:
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.sqlite3',
            # 'ENGINE': 'django.contrib.gis.db.backends.spatialite',
            'NAME': os.path.join(ROOT_DIR, 'data', 'db.dev.sqlite3'),
            'TEST': {
                'NAME': os.path.join(ROOT_DIR, 'data', 'db.test.sqlite3'),
            }
        }
    }

STATIC_ROOT = os.path.join(ROOT_DIR, 'static')

# django-assets
# http://django-assets.readthedocs.org/en/latest/settings.html

ASSETS_LOAD_PATH = STATIC_ROOT
ASSETS_ROOT = os.path.join(ROOT_DIR, 'assets', "compressed")
ASSETS_DEBUG = env('ASSETS_DEBUG', default=DEBUG)  # Disable when testing compressed file in DEBUG mode
if ASSETS_DEBUG:
    ASSETS_URL = STATIC_URL
    ASSETS_MANIFEST = "json:{}".format(os.path.join(ASSETS_ROOT, "manifest.json"))
else:
    ASSETS_URL = STATIC_URL + "assets/compressed/"
    ASSETS_MANIFEST = "json:{}".format(os.path.join(STATIC_ROOT, 'assets', "compressed", "manifest.json"))
ASSETS_AUTO_BUILD = ASSETS_DEBUG
ASSETS_MODULES = ('website.assets',)

The last bit shows the power here. ASSETS_DEBUG has a sensible default, which can be overridden in settings/production.py, and even that can be overridden by an environment setting! Yay!

In effect we have a mixed hierarchy of importance:

  1. settings/<purpose>.py - sets defaults based on purpose, does not store secrets
  2. settings/base.py - is mostly controlled by environment
  3. process environment settings - 12 factor baby!
  4. settings/.env - local defaults for easy startup
Norma answered 12/5, 2017 at 12:42 Comment(3)
Hey Janusz... so in the .env file would go all the API keys and auth keys and passwords etc? Just like TWILLIO_API = "abc123"? Or TWILLIO_API = env("TWILLIO_API")?Beryllium
Yes, but this is only a fallback for environment settings. This file comes in handy for development but is not saved in the repo or pushed to production, where you should strictly use environment settings or your platform's equivalent that will in turn set environment settings for the server process.Norma
How to define production settings? For example when I'm explicitly defining my DJANGO_SETTINGS_MODULE as website/settings/production, the init file is still loading the local.py settings. How can I avoid it, or am I doing something wrong? @JanuszSkoniecznyNitroso
A
14

I use a settings_local.py and a settings_production.py. After trying several options I've found that it's easy to waste time with complex solutions when simply having two settings files feels easy and fast.

When you use mod_python/mod_wsgi for your Django project you need to point it to your settings file. If you point it to app/settings_local.py on your local server and app/settings_production.py on your production server then life becomes easy. Just edit the appropriate settings file and restart the server (Django development server will restart automatically).
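
For the development server, manage.py can default to the local settings module so a plain runserver just works. A sketch assuming the file names above (app/settings_local.py):

# manage.py -- default to the local settings module on a development machine
import os
import sys

if __name__ == "__main__":
    # --settings=... or an exported DJANGO_SETTINGS_MODULE still overrides this default
    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "app.settings_local")
    from django.core.management import execute_from_command_line
    execute_from_command_line(sys.argv)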

Aviculture answered 26/10, 2009 at 18:7 Comment(5)
And what about the local development server? is there a way to tell the django webserver (run using python manage.py runserver), which settings file to use?Machado
@Machado if you add --settings=[module name] (no .py extension) to the end of the runserver command you can specify which settings file to use. If you're going to do that, do yourself a favor and make a shell script/batch file with the development settings configured. Trust me, your fingers will thank you.Continuation
this is the solution I use. hacking up a settings file to be used for both production or development is messyGrisham
I think its better to use settings.py in development, as you don't have to specify it all the time.Margarethe
Am I correct in assuming this method requires importing of the settings module via the proxy, django.conf.settings? Otherwise you'd need to edit import declarations to point at the correct settings file when pushing live.Dalston
A
7

Remember that settings.py is a live code file. Assuming that you don't have DEBUG set on production (which is a best practice), you can do something like:

if DEBUG:
    STATIC_PATH = '/path/to/dev/files'
else:
    STATIC_PATH = '/path/to/production/files'

Pretty basic, but you could, in theory, go up to any level of complexity based on just the value of DEBUG - or any other variable or code check you wanted to use.

Antiquated answered 26/10, 2009 at 18:5 Comment(0)
C
7

I manage my configurations with the help of django-split-settings.

It is a drop-in replacement for the default settings. It is simple, yet configurable. And refactoring of your existing settings is not required.

Here's a small example (file example/settings/__init__.py):

from split_settings.tools import optional, include
import os

if os.environ['DJANGO_SETTINGS_MODULE'] == 'example.settings':
    include(
        'components/default.py',
        'components/database.py',
        # This file may be missing:
        optional('local_settings.py'),

        scope=globals()
    )

That's it.

Update

I wrote a blog post about managing django's settings with django-split-settings. Have a look!

Cumulostratus answered 2/11, 2015 at 19:21 Comment(5)
I tried that.. ran into a wall once i tried to run my django unit tests.. i just couldn't figure out how to specify which settings file to read fromGottschalk
I have created a gist for you: gist.github.com/sobolevn/006c734f0520439a4b6c16891d65406cCumulostratus
i got something like this in my code, so i check the settings.DEBUG flag to know if i wanna import stuff.. that flag is always set to false in django unit tests (see here) so my work around is to override them at each test like soGottschalk
here is another question though: my uwsgi.ini file has different settings across dev/prod.. any idea of how to make it pick values from my settings file?Gottschalk
sorry, i don't get the setup. you can ask a separate question with more details and i will try to help you.Cumulostratus
J
6

The problem with most of these solutions is that you either have your local settings applied before the common ones, or after them.

So it's impossible to override things like

  • the env-specific settings define the addresses for the memcached pool, and in the main settings file this value is used to configure the cache backend
  • the env-specific settings add or remove apps/middleware to the default one

at the same time.

One solution can be implemented using "ini"-style config files with the ConfigParser class. It supports multiple files, lazy string interpolation, default values and a lot of other goodies. Once a number of files have been loaded, more files can be loaded and their values will override the previous ones, if any.

You load one or more config files, depending on the machine address, environment variables and even values in previously loaded config files. Then you just use the parsed values to populate the settings.

One strategy I have successfully used has been:

  • Load a default defaults.ini file
  • Check the machine name, and load all files which matched the reversed FQDN, from the shortest match to the longest match (so, I loaded net.ini, then net.domain.ini, then net.domain.webserver01.ini, each one possibly overriding values of the previous). This also accounts for developers' machines, so each one could set up their preferred database driver, etc. for local development
  • Check if there is a "cluster name" declared, and in that case load cluster.cluster_name.ini, which can define things like database and cache IPs

As an example of something you can achieve with this, you can define a "subdomain" value per-env, which is then used in the default settings (as hostname: %(subdomain)s.whatever.net) to define all the necessary hostnames and cookie things django needs to work.

This is as DRY as I could get; most (existing) files had just 3 or 4 settings. On top of this I had to manage customer configuration, so an additional set of configuration files (with things like database names, users and passwords, assigned subdomain etc.) existed, one or more per customer.

One can scale this as low or as high as necessary; you just put in the config file the keys you want to configure per-environment, and once there's a need for a new config, put the previous value in the default config and override it where necessary.

This system has proven reliable and works well with version control. It has been used for a long time managing two separate clusters of applications (15 or more separate instances of the django site per machine), with more than 50 customers, where the clusters were changing size and members depending on the mood of the sysadmin...
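
A minimal sketch of the loading step (the file, section and option names are illustrative; the real setup described above layers more files):

import configparser
import socket

config = configparser.ConfigParser()

# Build the list from the shortest to the longest reversed-FQDN match;
# later files override values from earlier ones, missing files are skipped.
files = ['defaults.ini']
parts = socket.getfqdn().split('.')[::-1]          # e.g. ['net', 'domain', 'webserver01']
for i in range(1, len(parts) + 1):
    files.append('.'.join(parts[:i]) + '.ini')     # net.ini, net.domain.ini, ...
config.read(files)

# Populate Django settings from the parsed values
DEBUG = config.getboolean('django', 'debug', fallback=False)
DATABASES = {
    'default': {
        'ENGINE': config.get('database', 'engine', fallback='django.db.backends.sqlite3'),
        'NAME': config.get('database', 'name', fallback='db.sqlite3'),
    }
}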

Jerrylee answered 1/3, 2012 at 14:18 Comment(2)
Do you have an example of how you load the settings from the ini into Django's settings?Bega
See docs.python.org/2/library/configparser.html . You can load a parser with config = ConfigParser.ConfigParser() then read your files config.read(array_of_filenames) and get values using config.get(section, option). So first you load your config, and then you use it to read values for settings.Jerrylee
S
5

I am also working with Laravel and I like the implementation there. I tried to mimic it, combining it with the solution proposed by T. Stone (see above):

import re
import socket

PRODUCTION_SERVERS = ['*.webfaction.com', '*.whatever.com']

def check_env():
    # Match the current hostname against the wildcard patterns above
    for item in PRODUCTION_SERVERS:
        match = re.match(r"(^." + item + "$)", socket.gethostname())
        if match:
            return True
    return False

if check_env():
    PRODUCTION = True
else:
    PRODUCTION = False

DEBUG = not PRODUCTION

Maybe something like this would help you.

Supertonic answered 4/8, 2014 at 8:38 Comment(0)
M
4

My solution to that problem is also somewhat of a mix of some solutions already stated here:

  • I keep a file called local_settings.py that has the content USING_LOCAL = True in dev and USING_LOCAL = False in prod
  • In settings.py I do an import on that file to get the USING_LOCAL setting

I then base all my environment-dependent settings on that one:

DEBUG = USING_LOCAL
if USING_LOCAL:
    # dev database settings
else:
    # prod database settings

I prefer this to having two separate settings.py files that I need to maintain, as I can keep my settings structured in a single file more easily than having them spread across several files. Like this, when I update a setting I don't forget to do it for both environments.

Of course, every method has its disadvantages, and this one is no exception. The problem here is that I can't overwrite the local_settings.py file whenever I push my changes into production, meaning I can't just copy all files blindly, but that's something I can live with.
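
The import step described above could look like this (a sketch; the ImportError fallback to production defaults is an assumption, the original answer does not spell it out):

# settings.py
try:
    from local_settings import USING_LOCAL
except ImportError:
    USING_LOCAL = False  # treat a missing flag file as production

DEBUG = USING_LOCAL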

Molt answered 26/10, 2009 at 20:10 Comment(0)
A
4

For most of my projects I use the following pattern:

  1. Create settings_base.py where I store settings that are common for all environments
  2. Whenever I need to use a new environment with specific requirements I create a new settings file (eg. settings_local.py) which inherits the contents of settings_base.py and overrides/adds the proper settings variables (from settings_base import *)

(To run manage.py with a custom settings file you simply use the --settings command option: manage.py <command> --settings=settings_you_wish_to_use, where the value is a Python module path without the .py extension)

Advowson answered 3/4, 2011 at 9:53 Comment(0)
F
4

1 - Create a new folder inside your app and name it settings.

2 - Now create a new __init__.py file in it and inside it write

from .base import *

try:
    from .local import *
except ImportError:
    pass

try:
    from .production import *
except ImportError:
    pass

3 - Create three new files in the settings folder: base.py, local.py, and production.py.

4 - Inside base.py, copy all the content of the previous settings.py file, then rename that old file to something different, let's say old_settings.py.

5 - In base.py, change your BASE_DIR path to point to the new location of the settings:

Old path -> BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))

New path -> BASE_DIR = os.path.dirname(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

This way, the project directory stays structured and manageable across production and local development.

Fine answered 10/6, 2017 at 4:32 Comment(0)
O
3

I use a variation of what jpartogi mentioned above, that I find a little shorter:

import platform
from django.core.management import execute_manager 

computername = platform.node()

try:
  settings = __import__(computername + '_settings')
except ImportError: 
  import sys
  sys.stderr.write("Error: Can't find the file '%r_settings.py' in the directory containing %r. It appears you've customized things.\nYou'll have to run django-admin.py, passing it your settings module.\n(If the file local_settings.py does indeed exist, it's causing an ImportError somehow.)\n" % (computername, __file__))
  sys.exit(1)

if __name__ == "__main__":
  execute_manager(settings)

Basically on each computer (development or production) I have the appropriate hostname_settings.py file that gets dynamically loaded.

Ottoman answered 6/4, 2011 at 19:48 Comment(0)
W
3

There is also Django Classy Settings. I personally am a big fan of it. It's built by one of the most active people on the Django IRC. You would use environment vars to set things.

http://django-classy-settings.readthedocs.io/en/latest/

Williamsen answered 21/2, 2017 at 17:56 Comment(0)
B
3

Making multiple versions of settings.py is an anti-pattern for the 12 Factor App methodology. Use python-decouple or django-environ instead.
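
For example, with python-decouple the settings module stays single and fully versioned while the actual values come from environment variables (or a local .env file); a minimal sketch:

# settings.py -- values come from the environment or a local .env file
from decouple import config

SECRET_KEY = config('SECRET_KEY')                      # required, no default
DEBUG = config('DEBUG', default=False, cast=bool)
ALLOWED_HOSTS = config('ALLOWED_HOSTS', default='localhost').split(',')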

Bouncy answered 8/2, 2019 at 19:39 Comment(0)
H
2

To use a different settings configuration in each environment, create a separate settings file per environment. Then, in your deployment script, start the server using the --settings=<my-settings> parameter, via which you can use different settings in different environments.

Benefits of using this approach:

  1. Your settings will be modular based on each environment

  2. You may import master_settings.py, containing the base configuration, in environment_configuration.py and override the values that you want to change in that environment.

  3. If you have a huge team, each developer may have their own local_settings.py which they can add to the code repository without any risk of modifying the server configuration. You can add these local settings to .gitignore if you use git, or .hgignore if you use Mercurial for version control (or any other). That way local settings won't even be part of the actual code base, keeping it clean.

Helsinki answered 9/11, 2016 at 22:40 Comment(0)
K
2

I had my settings split as follows

settings/
     |
     |- base.py
     |- dev.py
     |- prod.py  

We have 3 environments

  • dev
  • staging
  • production

Now obviously staging and production should have environments that are as similar as possible, so we kept prod.py for both.

But there was a case where I had to identify whether the running server was a production server. @T. Stone's answer helped me write the check as follows.

from socket import gethostname, gethostbyname  
PROD_HOSTS = ["webserver1", "webserver2"]

DEBUG = False
ALLOWED_HOSTS = [gethostname(), gethostbyname(gethostname()),]


if any(host in PROD_HOSTS for host in ALLOWED_HOSTS):
    SESSION_COOKIE_SECURE = True
    CSRF_COOKIE_SECURE = True  
Killie answered 12/8, 2017 at 16:39 Comment(0)
I
1

I differentiate it in manage.py and created two separate settings files: local_settings.py and prod_settings.py.

In manage.py I check whether the server is a local server or a production server. If it is a local server it loads up local_settings.py, and if it is a production server it loads up prod_settings.py. Basically this is how it looks:

#!/usr/bin/env python
import sys
import socket
from django.core.management import execute_manager 

ipaddress = socket.gethostbyname( socket.gethostname() )
if ipaddress == '127.0.0.1':
    try:
        import local_settings # Assumed to be in the same directory.
        settings = local_settings
    except ImportError:
        import sys
        sys.stderr.write("Error: Can't find the file 'local_settings.py' in the directory containing %r. It appears you've customized things.\nYou'll have to run django-admin.py, passing it your settings module.\n(If the file local_settings.py does indeed exist, it's causing an ImportError somehow.)\n" % __file__)
        sys.exit(1)
else:
    try:
        import prod_settings # Assumed to be in the same directory.
        settings = prod_settings    
    except ImportError:
        import sys
        sys.stderr.write("Error: Can't find the file 'prod_settings.py' in the directory containing %r. It appears you've customized things.\nYou'll have to run django-admin.py, passing it your settings module.\n(If the file prod_settings.py does indeed exist, it's causing an ImportError somehow.)\n" % __file__)
        sys.exit(1)

if __name__ == "__main__":
    execute_manager(settings)

I found it easier to separate the settings into two separate files instead of doing lots of ifs inside a single settings file.

Inscribe answered 27/10, 2009 at 11:52 Comment(0)
O
1

As an alternative to maintaining different files, if you will: if you are using git or any other VCS to push code from local to the server, you can add the settings file to .gitignore.

This will allow you to have different content in both places without any problem. So on the server you can configure an independent version of settings.py, and any changes made locally won't be reflected on the server, and vice versa.

In addition, it will also keep the settings.py file off GitHub, a big mistake I have seen many newbies make.

Oddball answered 3/5, 2016 at 6:15 Comment(0)
E
0

I think the best solution is the one suggested by @T. Stone, but I don't know why they don't just use the DEBUG flag in Django. I wrote the code below for my website:

if DEBUG:
    from .local_settings import *

Simple solutions are always better than complex ones.

Exodontics answered 25/2, 2019 at 5:0 Comment(0)
R
0

In your .env file, use:

DJANGO_SETTINGS_MODULE=config.settings.prod

That will make Django use all the configuration from your prod.py file by default; the environment value takes precedence even when you have:

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "config.settings.local")

When you run your project, you will see in the console that it points to the configuration file indicated in the .env file:

Django version 4.2.4, using settings 'config.settings.prod'
Starting development server at http://127.0.0.1:8000/
Quit the server with CONTROL-C.
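
Note that Django itself does not read .env files; something has to load the file into the environment before the setdefault() call runs. A sketch using python-dotenv in manage.py (an assumption, the answer does not say which loader it relies on):

#!/usr/bin/env python
import os
import sys

from dotenv import load_dotenv  # pip install python-dotenv

if __name__ == "__main__":
    load_dotenv()  # copies DJANGO_SETTINGS_MODULE from .env into os.environ
    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "config.settings.local")
    from django.core.management import execute_from_command_line
    execute_from_command_line(sys.argv)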
Rosas answered 20/9, 2023 at 23:9 Comment(0)
C
-2

I found the responses here very helpful. (Has this been more definitively solved? The last response was a year ago.) After considering all the approaches listed, I came up with a solution that I didn't see listed here.

My criteria were:

  • Everything should be in source control. I don't like fiddly bits lying around.
  • Ideally, keep settings in one file. I forget things if I'm not looking right at them :)
  • No manual edits to deploy. Should be able to test/push/deploy with a single fabric command.
  • Avoid leaking development settings into production.
  • Keep as close as possible to the "standard" (*cough*) Django layout.

I thought switching on the host machine made some sense, but then figured the real issue here is different settings for different environments, and had an aha moment. I put this code at the end of my settings.py file:

try:
    os.environ['DJANGO_DEVELOPMENT_SERVER'] # throws error if unset
    DEBUG = True
    TEMPLATE_DEBUG = True
    # This is naive but possible. Could also redeclare full app set to control ordering. 
    # Note that it requires a list rather than the generated tuple.
    INSTALLED_APPS.extend([
        'debug_toolbar',
        'django_nose',
    ])
    # Production database settings, alternate static/media paths, etc...
except KeyError: 
    print 'DJANGO_DEVELOPMENT_SERVER environment var not set; using production settings'

This way, the app defaults to production settings, which means you are explicitly "whitelisting" your development environment. It is much safer to forget to set the environment variable locally than the other way around, where forgetting to set something in production would let some dev settings be used.

When developing locally, either from the shell or in a .bash_profile or wherever:

$ export DJANGO_DEVELOPMENT_SERVER=yep

(Or if you're developing on Windows, set it via the Control Panel or whatever it's called these days... Windows has always made it so obscure to set environment variables.)

With this approach, the dev settings are all in one (standard) place, and simply override the production ones where needed. Any mucking around with development settings should be completely safe to commit to source control with no impact on production.

Constituent answered 17/2, 2014 at 6:43 Comment(1)
Better to just maintain different config files, and pick one using the Django standard env variable DJANGO_SETTINGS_MODULEGeisel
