Private settings in Django and Deployment

I am using Django and provision my stack with Ansible. I then use Fabric to deploy my Django project, pulling the code from GitHub.

My question: what is the best practice for dealing with private settings in Django's settings.py file, such as passwords for email or S3? Currently, I transfer a settings_production.py file from my machine to the production machine at the end of my deployment script, before restarting the application server. This file contains the settings that I do not put into settings.py as part of the repo.

At the end of my settings.py I add something like:

try:
    from settings_production import *
except ImportError:
    pass

Are there better ways to do this?

Rhizome answered 16/10, 2015 at 20:57 Comment(0)

The answer is: http://12factor.net/config.

You should manage code-related differences between environments via different settings modules. An example would be adding debug_toolbar to INSTALLED_APPS locally while removing it in production. Rather than using the old try: from local_settings import * / except ImportError: pass idiom with an out-of-version-control local_settings.py on your local machine, keep all of your settings modules in version control, including your local settings. Then, in wsgi.py and manage.py, use os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.conf.local') to default your project to the local settings. In dev / production, you add an environment variable to select the respective settings module (e.g., DJANGO_SETTINGS_MODULE=myproject.conf.dev).
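For instance, a minimal manage.py following this pattern could look like the sketch below (the module path myproject.conf.local is the assumed layout from above, not a fixed convention):

import os
import sys

if __name__ == '__main__':
    # Default to local settings; dev / production win by exporting
    # DJANGO_SETTINGS_MODULE, since setdefault only applies when unset.
    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.conf.local')
    from django.core.management import execute_from_command_line
    execute_from_command_line(sys.argv)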

With 12 Factor it's no longer necessary to keep any settings module out of version control, because you don't put passwords or other sensitive settings directly into a settings module. You instead keep them in the environment and access them like this:

# Inside of a settings module
import os

FOO_PASSWORD = os.environ['FOO_PASSWORD']  # raises KeyError if unset
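For optional, non-secret settings you can fall back to a default instead of failing fast; a small variant (the setting name here is hypothetical):

# Optional setting with a safe default; keep the bracketed form above
# for required secrets so that missing configuration fails fast.
EMAIL_PORT = int(os.environ.get('EMAIL_PORT', '587'))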

In environments like Heroku, this setup is simple, because you can enter config vars for your app via the web interface.

I recommend pretty much all of 12 Factor's principles, especially things like disposability, logs, and config.

Reasonable sacrifice

If you'd like to maintain an extra settings module out of version control, to avoid having to use environment variables during local dev (I don't blame you), you can still follow the above principles: add try: from some_other_local import * / except ImportError: pass to the bottom of the local settings module that is in version control. This lets you set only the necessary override settings locally while keeping the rest of your local settings (e.g., local database, relative static / media file paths, installed apps, etc.) in version control, which gives you the best of both worlds.
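Concretely, the bottom of the version-controlled local settings module might look like this (some_other_local being the untracked, per-developer file from the text):

# myproject/conf/local.py (in version control)
# ... shared local settings: database, static / media paths, apps ...

try:
    from some_other_local import *  # untracked per-developer overrides
except ImportError:
    pass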


Disrepair answered 16/10, 2015 at 22:15 Comment(4)
This works well for me, except: if I deploy my SECRET_KEY as an environment variable through Supervisor's environment= setting, I lose the ability to perform Django housekeeping tasks on the server. python manage.py collectstatic fails with: ImproperlyConfigured("The SECRET_KEY setting must not be empty."). Any suggestions on how to set this up properly, or how to call manage.py? – Rhizome
@FalkSchuetzenmeister you can add them to the environment in your shell via export for that. – Disrepair
Yes, it just seems to defeat the purpose of setting the environment variables in my Supervisor configuration file, because I have to set them up in yet another place. – Rhizome
@FalkSchuetzenmeister the ideal solution is to manage your env vars through a separate service and hook into that via supervisord as well as via a .bashrc script, but in the meantime you can enter them manually instead of having to set all that up. – Disrepair

I think a good example of how to do it is jcalazan/ansible-django-stack, which, besides the code, includes a few useful links, in particular one about how to deploy encrypted copies of your SSL keys and other files with Ansible and OpenSSL.

Aoudad answered 16/10, 2015 at 22:19 Comment(0)

I think you could create a settings.py, then in there you do:

try:
    from local_settings import *
except ImportError:
    pass

Put this at the END of settings.py. For your development environment, create local_settings.py and override the production settings there with your local values. This way you can track changes to your production settings while keeping your local setup as flexible as possible.
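For example, a local_settings.py might look like this (the values are hypothetical):

# local_settings.py (not committed): override production settings
# from settings.py with local development values
DEBUG = True

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': 'dev.sqlite3',
    }
}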

The only problem is that if you accidentally forget to override a production setting in your local_settings.py, you might end up running with production settings, which could be harmful.

For me, I just add this to my local ~/.bashrc to make sure that Django always uses local_settings.py:

export DJANGO_SETTINGS_MODULE=app.settings.local_settings

Edit:

If you don't want the repo to track the changes and you don't want to touch the production server, I don't think there's any better way than copying the settings file. After all, your changes have to move from your computer to production somehow! Maybe you could rsync the file, but that's no better than Fabric's put, right?

Coparcenary answered 16/10, 2015 at 21:32 Comment(1)
That is what I am doing: the settings_local.py for production sets the values that I do not want to commit to the git repo. Sorry for being imprecise. – Rhizome

All configuration should be done with environment variables because:

  1. They are language agnostic
  2. They are less likely to be checked into source control by accident
  3. In certain deployment environments, environment variables are the only form of configuration available

So the idea is to make your deployments fully configurable with env var flags.
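As a sketch, a settings module driven entirely by the environment might read its knobs like this (all names here are hypothetical):

import os

DEBUG = os.environ.get('DJANGO_DEBUG', '') == '1'  # boolean flag, off by default
DB_NAME = os.environ.get('DB_NAME', 'app')         # optional, with a default
SECRET_KEY = os.environ['SECRET_KEY']              # required, fails fast if unset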

In a development environment, direnv is an easy way to set env vars per directory.
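For example, a project-level .envrc (values hypothetical) that direnv loads automatically when you cd into the directory:

# .envrc, loaded by direnv on entering the project directory
export DJANGO_SETTINGS_MODULE=settings
export DJANGO_DEBUG=1
export DB_NAME=abc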

You can additionally have a mechanism which allows devs to override settings with a gitignored python file in their development environments:

# the manage.py file
import os
import sys

if __name__ == '__main__':
    if os.path.isfile('local_overrides.py'):
        # a local, gitignored overrides module exists: use it as settings
        os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'local_overrides')
    else:
        os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'settings')
    from django.core.management import execute_from_command_line
    execute_from_command_line(sys.argv)

And the local_overrides file:

# the local_overrides.py file
import os

# set env vars before importing the shared settings module,
# so that module sees them when it reads os.environ
os.environ["DB_NAME"] = "abc"

from settings import *  # pull in everything from the tracked settings

INSTALLED_APPS.append('xyz')  # then tweak whatever you like locally

Note that this differs from other answers, which suggest adding a try/import at the end. Here the local_overrides file takes full control, and this setup has three advantages:

  1. You can set env vars prior to importing from the settings module, from the convenience of a Python file, using os.environ
  2. Because it's a Python file, it triggers hot reloading, whereas other env var manipulation tools require a restart of your dev server
  3. You can import from any number of settings modules, which makes it easier to experiment

This complies with the 12factor recommendation but adds developer convenience.

Note that 12factor recommends against settings modules per environment. I would argue there is a difference between an environment and a "running mode", of which there are essentially three: on a server, in dev mode, or as unit tests.

So avoid "production" settings as a python module, and prefer "server" as a python module serving all server environments because they share similarities, but use granular env vars to differentiate those from one another.

Rarity answered 3/9, 2020 at 12:32 Comment(0)
