How can we make Django tests faster?

We are using Django 1.4 with PostgreSQL on Ubuntu 12.04. We have many tests, and running them is very slow, I think because the test database is created from scratch for each run. I want to make the tests faster by keeping the database in memory rather than on disk. How do I do that? Do you have any links or tutorials?

Worthless answered 24/3, 2014 at 14:28 Comment(2)
Also check this out: nemesisdesign.net/blog/coding/… – Tessin
My tricks: eatsomecode.com/faster-django-tests – Mcripley

In Django 1.9, if you have a multi-core processor, a great option is the flag:

--parallel

This requires you to pip install tblib, but it lets you run your unit tests simultaneously on multiple cores (https://docs.djangoproject.com/en/1.10/ref/django-admin/#cmdoption-test-parallel).
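
A sketch of typical usage, assuming the standard manage.py layout (the worker count defaults to one per core, or can be pinned explicitly):

    pip install tblib
    ./manage.py test --parallel
    ./manage.py test --parallel=4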

Another great option for Django 1.8+ is the flag:

--keepdb

It reuses your test database, avoiding the long wait caused by creating a new test database every time you run tests (https://docs.djangoproject.com/en/1.10/ref/django-admin/#cmdoption-test-keepdb).
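
Typical usage (the first run still creates the test database; later runs reuse it and apply any new migrations):

    ./manage.py test --keepdb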

Tavern answered 10/3, 2017 at 16:56 Comment(1)
It's worth mentioning that while --keepdb makes the tests faster, it should only be used locally; actual test runs (e.g. the ones in your CI/CD pipeline) should not use it. It could prevent you from discovering some issues, especially ones related to models and migrations. – Enlace

The best option is to have a separate settings file for your tests. In settings_test.py, you tell Django to use SQLite, for which the test runner uses an in-memory database by default:

from base_settings import *

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': ':memory:',  # note the trailing colon; ':memory' would create an on-disk file named ":memory"
    }
}

And then run your tests by adding --settings=settings_test.
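
For example, assuming settings_test.py sits alongside base_settings.py on your Python path:

    ./manage.py test --settings=settings_test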

See also the Django docs:
https://docs.djangoproject.com/en/dev/topics/testing/overview/#the-test-database

Praetorian answered 24/3, 2014 at 15:9 Comment(3)
-1. What's the point of testing using SQLite (which assumes you're not using anything it won't understand)? If you then deploy using Postgres, how can you be sure that your DB-related scripts are working, based on your tests? – Gowon
Uri asked how to optimise his tests using an in-memory database, and using an in-memory SQLite database for testing is the standard method in Django. If he is using Django, one would assume he is using the Django ORM, which means the majority of his application should be database-agnostic (so using SQLite for tests and Postgres for deployment is not an issue; in fact it's a common way of doing things). See e.g. seanhayes.name/2012/11/28/ways-speed-django-tests – Praetorian
In my experience, running tests against in-memory SQLite often actually makes running single tests take longer, especially if you have lots of migrations: in-memory SQLite means redoing all the migrations on every test run. Running with --keepdb against a persistent Postgres instance is often just as fast as, or faster than, transient in-memory SQLite. – Sallet

There are a couple of SO threads on this that are helpful.

I definitely do use the SQLite trick for sanity checks, but if you're doing anything database-specific it will drive you nuts: certain SQL differences, differences in data precision, etc. It also undercuts the point of testing: if you're using the tests to reassure yourself that a change will work once pushed to production, running them against a different database isn't a good way to do that. Try using nose to skip database recreation when possible, and optimize your local Postgres setup. You can also try avoiding the DB altogether.
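
If you go the nose route, here is a minimal sketch assuming the django-nose package, whose runner can skip database re-creation via the REUSE_DB environment variable:

    # settings.py -- assumes django-nose is installed
    INSTALLED_APPS += ('django_nose',)
    TEST_RUNNER = 'django_nose.NoseTestSuiteRunner'

and then:

    REUSE_DB=1 ./manage.py test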

The thing that works best for me is to treat the downtime caused by testing as an opportunity to come up with better changes, and as encouragement to think about what I'm changing before firing up the test runner.

Thingumabob answered 24/3, 2014 at 15:15 Comment(0)

Fast forward to 2016 and we have a very nice option in manage.py to speed up tests.

--keepdb, -k (new in Django 1.8): Preserves the test database between test runs. This has the advantage of skipping both the create and destroy actions, which can greatly decrease the time to run tests, especially those in a large test suite. If the test database does not exist, it will be created on the first run and then preserved for each subsequent run. Any unapplied migrations will also be applied to the test database before running the test suite.

If you are not using TransactionTestCase and its subclasses, a large portion of the test running time comes from database creation, and if you have a large number of migrations it gets really bad. But you can avoid all that with the command below. (Note: -k was the shorthand for --keepdb up through Django 2.2; from Django 3.0 it instead selects tests by name pattern, so spell out --keepdb there.)

 ./manage.py test -k myapp
Aquitaine answered 23/4, 2016 at 15:49 Comment(0)

You can simply change the database to SQLite for tests:

import sys

if 'test' in sys.argv:
    # the settings key is 'ENGINE' (uppercase) and takes the full backend path
    DATABASES['default']['ENGINE'] = 'django.db.backends.sqlite3'

Note that some of your tests may fail due to incompatibilities between databases, but generally this should work.

Sletten answered 24/3, 2014 at 15:4 Comment(3)
-1. What's the point of testing using SQLite (which assumes you're not using anything it won't understand)? If you then deploy using Postgres, how can you be sure that your DB-related scripts are working, based on your tests? – Gowon
Well, if you have any DB-related scripts this won't work, but for simple/regular use cases it is fast and reliable. If you're concerned about differences between the two DB engines, you can always run the tests on Postgres to check whether there are any. – Sletten
Countering the downvote, as it's normally business logic, not the ORM, that needs testing. Sure, raw database-specific SQL can break, but that's a fairly marginal case; it has to be hand-crafted, and you should know what you're doing. Saving 8s on every unit test run (tens of times a day per developer!) saves far more resources immediately. – Ezequiel

Not familiar with Python or Django, but conceptually, you should be able to:

  1. Create your database and load your fixtures once when you bootstrap your tests.
  2. Define a savepoint.
  3. After each test or group of tests, roll back to that savepoint.

(You might need to override the ORM's begin/end transaction code for the duration of the tests if it doesn't support savepoints.)

http://www.postgresql.org/docs/current/static/sql-savepoint.html
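
In Django terms, a minimal sketch of this pattern using the public transaction API (run_suite is a hypothetical helper; note that django.test.TestCase already wraps each test in a transaction/savepoint and rolls it back for you):

    from django.db import transaction

    def run_suite(tests):
        with transaction.atomic():                       # one outer transaction for the whole suite
            # ...create schema / load fixtures once here...
            for test in tests:
                sid = transaction.savepoint()            # 2. define a savepoint
                try:
                    test()
                finally:
                    transaction.savepoint_rollback(sid)  # 3. roll back after each test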

(I'd add that at a conceptual level, your DBAL and ORM should get mocked in your tests, so that you're testing your component in isolation. Which is to say, you probably shouldn't be connecting to the database to begin with in most of your tests.)

Gowon answered 24/3, 2014 at 15:32 Comment(4)
The problem with using rollback is that any test that only happens within a single transaction cannot be a very thorough test. – Phonemics
@jjanes: hence the savepoint instead of a single transaction: you can nest those at will. – Gowon
But that still only works within a single transaction, doesn't it? Once it is committed, it can't be rolled back, and until it is committed a side transaction can't see it. I think any complete test suite would have to include multi-connection visibility/consistency tests. – Phonemics
@jjanes: ya, hence the need to tweak the ORM code: for the duration of the tests, instead of issuing begin, rollback and commit statements, it needs to issue savepoint tx, rollback to tx and release tx. That way, nothing is ever actually committed. Good point on the multi-connection tests, though. – Gowon
