Problems loading fixtures for several dependent applications in Django unit tests
I'm writing unit tests for already existing code and ran into the following problem:

After running syncdb to create the test database, Django automatically fills several tables like django_content_type or auth_permissions.

Now imagine I need to run a complex test, like checking user registration, which needs a lot of data tables and the connections between them.

If I try to use my whole existing database to make fixtures (which would be rather convenient for me), I get an error like the one described here. This happens because Django has already filled tables like django_content_type.

The next possible way is to use the dumpdata --exclude option for the tables already filled by syncdb. But this doesn't work well either: if I take the User and user Group objects from my database while the User Permissions table was automatically created by syncdb, I can get errors because the primary keys connecting them now point to the wrong rows. This is described better in the 'fixture hell' part here, but the solution shown there doesn't look good.

The next possible scheme I see is this:

  1. I run my tests; Django creates the test database, runs syncdb and creates all those tables.
  2. In my test setup I drop this database and create a new blank one.
  3. Also in the test setup, I load a data dump from the existing database.
Colo answered 24/3, 2010 at 11:44 Comment(2)
You can use dumpdata --natural --exclude <django-core-apps>. The --natural flag tells Django to use natural keys, which are a replacement for primary ids. For example, all foreign keys to the ContentType model will be dumped like: ["<app_label>", "<model_name>"], instead of: <content-type-id> (see the example command sketched after these comments).Fatten
This helped me a lot, please provide it as an answer.Bacchus
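
A sketch of the command suggested in the comment above (the fixture name is arbitrary; on older Django versions --exclude may only accept whole app labels, and --natural was later replaced by --natural-foreign/--natural-primary):

  $ python manage.py dumpdata --natural --exclude contenttypes --indent 2 > test_data.json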

That's how the problem was solved:

After syncdb has created the test database, in the setUp part of the tests I use os.system to access the shell from my code. Then I just load a dump of the database that I want to use for the tests.

So it works like this: syncdb fills the content type and some other tables with data. Then, in the setUp part of the tests, loading the SQL dump clears all the previously created data and I get a clean database.
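
Roughly, the setUp looks like this (a minimal sketch; the class name, database name, credentials and dump path are placeholders for my local setup):

  import os
  from django.test import TestCase

  class RegistrationTests(TestCase):

      def setUp(self):
          # Overwrite the syncdb-created data with a dump of the database
          # prepared for the tests (names and credentials are placeholders).
          os.system('mysql -u root -proot test_database < test_db.sql')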

Maybe not the best solution, but it works =)

Colo answered 26/3, 2010 at 17:33 Comment(2)
In setUp I'm calling something like: os.system('mysql -u root -proot test_database < test_db.sql'). So I'm loading my own test database, which I've previously filled with all the needed test data.Colo
This makes the test completely tied to your particular system. It's a nice workaround, but not a generic solution.Anastasius

My approach would be to first use South to make DB migrations easy (which doesn't directly help here, but is nice to have), and then use a module of model-creation functions.

When you run

  $  manage.py test my_proj

Django with South installed will create the test DB and run all your migrations, giving you a completely up-to-date test database.
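
Enabling South is just a matter of listing it in INSTALLED_APPS (a minimal settings.py sketch; the other apps shown are only examples):

  # settings.py (sketch)
  INSTALLED_APPS = (
      'django.contrib.auth',
      'django.contrib.contenttypes',
      'south',    # South handles schema migrations; the test runner then applies them
      'my_proj',
  )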

To write tests, first create a Python module called test_model_factory.py. In it, create functions that build your objects:

from django.contrib.auth.models import User

def mk_user():
    return User.objects.create(...)

Then in your tests you can import your test_model_factory module, and create objects for each test.

  def test_something(self):
      test_user = test_model_factory.mk_user()

      self.assertTrue(test_user.pk)  # e.g. check the user was actually created
Sargeant answered 22/7, 2010 at 23:2 Comment(0)
