We are using Django 1.4 with PostgreSQL on Ubuntu 12.04. We have many tests, and the problem is that running the tests is very slow, I think because for each test the database is created from scratch. I want to make the tests faster by running them with the database in memory (not hard disk). How do I do it? Do you have any links or tutorials?
In Django 1.9+, if you have a multi-core processor, a great option is the flag:
--parallel
This requires you to pip install tblib,
but it will let you run your unit tests simultaneously across multiple cores. (https://docs.djangoproject.com/en/1.10/ref/django-admin/#cmdoption-test-parallel)
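As a sketch, the two steps look like this (assuming a standard manage.py project layout):

```shell
# tblib lets tracebacks from worker processes be pickled back to the parent
pip install tblib

# run the test suite across all available cores (Django 1.9+)
python manage.py test --parallel
```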
Another great option for Django 1.8+ is the flag:
--keepdb
It reuses your test database, eliminating the long wait caused by creating a new test database each time you run the tests. (https://docs.djangoproject.com/en/1.10/ref/django-admin/#cmdoption-test-keepdb)
--keepdb
makes the tests faster, but it should only be used locally; actual test runs (e.g. the ones in your CI/CD pipeline) should not use it. It could prevent you from discovering some issues, especially ones related to models and migrations. – Enlace
The best option is to have a separate settings file for your tests. In settings_test.py you tell it to use SQLite, which by default uses an in-memory database:
from base_settings import *

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': ':memory:',
    }
}
And then run your tests by adding --settings=settings_test
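Assuming settings_test.py lives next to your other settings modules on the Python path, the invocation would look like:

```shell
python manage.py test --settings=settings_test
```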
See also the Django docs:
https://docs.djangoproject.com/en/dev/topics/testing/overview/#the-test-database
--keepdb
with a persistent Postgres instance is often just as fast or faster than transient, in-memory SQLite. – Sallet
There are a couple of SO threads that are helpful:
I definitely do use the SQLite trick to do sanity checks, but if you're doing anything database-specific it will drive you nuts: certain SQL differences, differences in data precision, etc. It also undercuts the point of testing: if you're using the tests to reassure you that a change will work once pushed to production, running them against a different database isn't a good way to do that. Try using nose to skip database recreation when possible and optimize your local Postgres setup. You can also try avoiding the DB altogether.
The thing that works best for me is trying to see the downtime caused by testing as an opportunity to come up with better changes and a way to encourage me to think about what I'm changing before firing up the test runner.
Fast forward to 2016 and we have a very nice option in manage.py to speed up tests.
--keepdb, -k (new in Django 1.8). Preserves the test database between test runs. This has the advantage of skipping both the create and destroy actions, which can greatly decrease the time to run tests, especially those in a large test suite. If the test database does not exist, it will be created on the first run and then preserved for each subsequent run. Any unapplied migrations will also be applied to the test database before running the test suite.
If you are not using TransactionTestCase and its subclasses, a large portion of the test running time will come from database creation. If you have a large number of migrations, it's going to be really bad, but you can avoid all that with
./manage.py test -k myapp
You can simply change the database for tests to sqlite:
import sys

if 'test' in sys.argv:
    DATABASES['default']['ENGINE'] = 'django.db.backends.sqlite3'
Note that some of your tests can fail due to incompatibilities between databases, but generally this should work.
Not familiar with python or Django, but conceptually, you should be able to:
- Create your database and load your fixtures once when you bootstrap your tests
- Define a savepoint.
- After each test or group of tests, roll back to that savepoint.
(You might need to override the ORM's begin/end transaction code for the duration of the tests if it doesn't support savepoints.)
http://www.postgresql.org/docs/current/static/sql-savepoint.html
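A minimal sketch of that savepoint pattern, using Python's sqlite3 module for illustration (the SAVEPOINT / ROLLBACK TO / RELEASE statements are the same SQL you would issue against PostgreSQL; the table and savepoint names here are made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.isolation_level = None  # manage transactions manually

# One-time setup: create schema and load fixtures, committed in autocommit mode.
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('fixture_user')")

# Before each test: mark a savepoint.
conn.execute("SAVEPOINT test_sp")

# ... the test mutates the database ...
conn.execute("INSERT INTO users VALUES ('created_during_test')")

# After the test: roll back to the savepoint, discarding the test's writes,
# then release it so the next test can start clean.
conn.execute("ROLLBACK TO SAVEPOINT test_sp")
conn.execute("RELEASE SAVEPOINT test_sp")

rows = [r[0] for r in conn.execute("SELECT name FROM users")]
print(rows)  # ['fixture_user'] -- only the fixture row survives
```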
(I'd add that at a conceptual level, your DBAL and ORM should get mocked in your tests, so that you're testing your component in isolation. Which is to say, you probably shouldn't be connecting to the database to begin with in most of your tests.)
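To sketch that isolation idea, here is a hypothetical example using unittest.mock to stand in for the ORM; get_active_usernames and the username/is_active fields are invented for illustration, and no database connection is ever made:

```python
from unittest import mock

def get_active_usernames(user_model):
    """Business logic that would normally query the database via the ORM."""
    return [u.username for u in user_model.objects.filter(is_active=True)]

# A mock stands in for the Django model class.
fake_user = mock.Mock(username="alice")
fake_model = mock.Mock()
fake_model.objects.filter.return_value = [fake_user]

result = get_active_usernames(fake_model)
print(result)  # ['alice']

# We can also assert the component queried the ORM the way we expected.
fake_model.objects.filter.assert_called_once_with(is_active=True)
```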
Instead of issuing begin, rollback and commit statements, it needs to begin issuing savepoint tx, rollback to tx and release tx. That way, it's never actually committed. Good point on the multi-connection tests though. – Gowon