Should I write Unit-Tests for CRUD operations when I have already Integration-Tests?

In our recent project, Sonar was complaining about weak test coverage. We noticed that it doesn't consider integration tests by default. Besides the fact that you can configure Sonar to consider them (JaCoCo plugin), we were discussing within the team whether there really is a need to write unit tests when you cover your whole service and database layer with integration tests anyway.

What I mean by integration tests is that all our tests run against a dedicated Oracle instance of the same type we use in production. We don't mock anything. If a service depends on another service, we use the real service. Data we need before running a test is constructed through factory classes that use our services/repositories (DAOs).

So from my point of view, writing integration tests for simple CRUD operations is not a big effort, especially when using frameworks like Spring Data/Hibernate. It is sometimes even easier, because you don't have to think about what and how to mock.
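For illustration, such a test in our suite looks roughly like the sketch below (JUnit 4 plus the Spring TestContext framework; the service, the factory, and the context file are made-up names, and imports are omitted as in the other snippets here):

// Runs against the real Spring context and the dedicated Oracle test instance; nothing is mocked.
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:test-context.xml")
@Transactional   // roll each test back so the database stays clean
public class EventServiceIntegrationTest {

    @Autowired
    private EventService eventService;          // the real service, no mocks

    @Autowired
    private EventTestDataFactory eventFactory;  // hypothetical factory that persists prerequisite data

    @Test
    public void createdEventCanBeReadBack() {
        Event event = eventFactory.createPersistedEvent("Team Meeting");
        Event found = eventService.findEventById(event.getId());
        assertThat(found.getName(), is("Team Meeting"));
    }
}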

So why should I write unit tests for my CRUD operations that are less reliable than the integration tests I can write?

The only downside I see is that integration tests take longer to run the bigger the project gets, so you don't want to run them all before every check-in. But I am not sure that is so bad if you have a CI environment with Jenkins/Hudson that will do the job.

So - any opinions or suggestions are highly appreciated!

Diaphaneity asked 17/9, 2012 at 22:30 Comment(0)

If most of your services simply pass through to your daos, and your daos do little but invoke methods on Spring's HibernateTemplate or JdbcTemplate, then you are correct that unit tests don't really prove anything your integration tests haven't already proved. However, having unit tests in place is valuable for all the usual reasons.

Since unit tests only test single classes, run in memory with no disk or network access, and never really test multiple classes working together, they normally go like this:

  • Service unit tests mock the daos.
  • Dao unit tests mock the database driver (or spring template) or use an embedded database (super easy in Spring 3).

To unit test the service that just passes through to the dao, you can mock like so:

@Before
public void setUp() {
    service = new EventServiceImpl();
    dao = mock(EventDao.class);
    service.eventDao = dao;
}

@Test
public void creationDelegatesToDao() {
    service.createEvent(sampleEvent);
    verify(dao).createEvent(sampleEvent);
}

@Test(expected=EventExistsException.class)
public void creationPropagatesExistExceptions() {
    doThrow(new EventExistsException()).when(dao).createEvent(sampleEvent);
    service.createEvent(sampleEvent);
}

@Test
public void updatesDelegateToDao() {
    service.updateEvent(sampleEvent);
    verify(dao).updateEvent(sampleEvent);
}

@Test
public void findingDelegatesToDao() {
    when(dao.findEventById(7)).thenReturn(sampleEvent);
    assertThat(service.findEventById(7), equalTo(sampleEvent));

    service.findEvents("Alice", 1, 5);
    verify(dao).findEventsByName("Alice", 1, 5);

    service.findEvents(null, 10, 50);
    verify(dao).findAllEvents(10, 50);
}

@Test
public void deletionDelegatesToDao() {
    service.deleteEvent(sampleEvent);
    verify(dao).deleteEvent(sampleEvent);
}

But is this really a good idea? These Mockito assertions are asserting that a dao method got called, not that it did what was expected! You will get your coverage numbers but you are more or less binding your tests to an implementation of the dao. Ouch.

Now, this example assumed the service had no real business logic. Normally the services will have business logic in addition to dao calls, and you surely must test that.
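For example (a hypothetical rule, not something from the snippets above), a service that validates its input before delegating is exactly the kind of thing that deserves its own unit test:

// Hypothetical business rule in the service, on top of the plain dao call:
public void createEvent(Event event) {
    if (event.getName() == null || event.getName().trim().isEmpty()) {
        throw new IllegalArgumentException("Event name must not be blank");
    }
    dao.createEvent(event);
}

// The matching unit test exercises the rule itself, not just the delegation,
// and checks that the dao is never reached when validation fails.
@Test
public void creatingEventWithBlankNameIsRejectedBeforeReachingTheDao() {
    try {
        service.createEvent(new Event(9, "   "));
        fail("expected an IllegalArgumentException");
    } catch (IllegalArgumentException expected) {
        verifyZeroInteractions(dao);
    }
}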

Now, for unit testing daos, I like to use an embedded database.

private EmbeddedDatabase database;
private EventDaoJdbcImpl eventDao = new EventDaoJdbcImpl();

@Before
public void setUp() {
    database = new EmbeddedDatabaseBuilder()
            .setType(EmbeddedDatabaseType.H2)
            .addScript("schema.sql")
            .addScript("init.sql")
            .build();
    eventDao.jdbcTemplate = new JdbcTemplate(database);
}

@After
public void tearDown() {
    database.shutdown();   // drop the in-memory database so each test starts fresh
}

@Test
public void creatingIncrementsSize() {
    Event e = new Event(9, "Company Softball Game");

    int initialCount = eventDao.findNumberOfEvents();
    eventDao.createEvent(e);
    assertThat(eventDao.findNumberOfEvents(), is(initialCount + 1));
}

@Test
public void deletingDecrementsSize() {
    Event e = new Event(1, "Poker Night");

    int initialCount = eventDao.findNumberOfEvents();
    eventDao.deleteEvent(e);
    assertThat(eventDao.findNumberOfEvents(), is(initialCount - 1));
}

@Test
public void createdEventCanBeFound() {
    eventDao.createEvent(new Event(9, "Company Softball Game"));
    Event e = eventDao.findEventById(9);
    assertThat(e.getId(), is(9));
    assertThat(e.getName(), is("Company Softball Game"));
}

@Test
public void updatesToCreatedEventCanBeRead() {
    eventDao.createEvent(new Event(9, "Company Softball Game"));
    Event e = eventDao.findEventById(9);
    e.setName("Cricket Game");
    eventDao.updateEvent(e);
    e = eventDao.findEventById(9);
    assertThat(e.getId(), is(9));
    assertThat(e.getName(), is("Cricket Game"));
}

@Test(expected=EventExistsException.class)
public void creatingDuplicateEventThrowsException() {
    eventDao.createEvent(new Event(1, "Id1WasAlreadyUsed"));
}

@Test(expected=NoSuchEventException.class)
public void updatingNonExistentEventThrowsException() {
    eventDao.updateEvent(new Event(1000, "Unknown"));
}

@Test(expected=NoSuchEventException.class)
public void deletingNonExistentEventThrowsException() {
    eventDao.deleteEvent(new Event(1000, "Unknown"));
}

@Test(expected=NoSuchEventException.class)
public void findingNonExistentEventThrowsException() {
    eventDao.findEventById(1000);
}

@Test
public void countOfInitialDataSetIsAsExpected() {
    assertThat(eventDao.findNumberOfEvents(), is(8));
}

I still call this a unit test even though most people might call it an integration test. The embedded database resides in memory, and it is brought up and taken down when the tests run. But this relies on the embedded database looking the same as the production database. Will that be the case? If not, then all that work was pretty useless. If so, then, as you say, these tests aren't doing anything different from the integration tests. But I can run them on demand with mvn test, and I have the confidence to refactor.

Therefore, I write these unit tests anyway and meet my coverage targets. When I write integration tests, I assert that an HTTP request returns the expected HTTP response. Yeah, it subsumes the unit tests, but hey, when you practice TDD you have those unit tests written before your actual dao implementation anyway.

If you write unit tests after your dao, then of course they are no fun to write. The TDD literature is full of warnings about how writing tests after your code feels like make-work and no one wants to do it.

TL;DR: Your integration tests will subsume your unit tests, and in that sense the unit tests are not adding real testing value. However, when you have a high-coverage unit test suite, you have the confidence to refactor. Of course, if the dao is just trivially calling Spring's data access template, then you might not be refactoring. But you never know. And finally, if the unit tests are written first in TDD style, you are going to have them anyway.

Icsh answered 18/9, 2012 at 0:39 Comment(2)
Ray, thanks a lot for your detailed explanation and examples! I guess we will stick to the approach of writing integration tests first. I would even go so far as to say that unit tests for CRUD services of a web application are nice-to-have, but what really counts are integration tests. And why not do it the TDD way, but write integration tests instead of unit tests before implementing the service? – Diaphaneity
That will be fine; the unit test police won't knock at your door. It is true that the integration tests will cover the unit tests. Personally, I do enjoy the feeling of having the unit tests there (confidence in refactoring, a feeling of completeness, and all that), but that's just me. If you are happy with gaining your coverage through integration tests as opposed to unit tests, and your integration tests are fast-running developer-written integration tests as opposed to QA-generated functional-system-integration-acceptance-production tests, then cool. :) – Icsh

You only really need to unit test each layer in isolation if you plan to have the layers exposed to components outside your project. For a web app, the only way the repository layer can be invoked is by the service layer, and the only way the service layer can be invoked is by the controller layer. So testing can start and end at the controller layer. Background tasks are invoked at the service layer, so they need to be tested there.

Testing with a real database is pretty fast these days, so it doesn't slow your tests down too much if you design your setup/teardown well. However, if there are any other dependencies that could be slow or problematic, those should be mocked/stubbed.
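As a rough sketch (all names here are hypothetical), such a controller-level test wired to the real service and repository, backed by an embedded database, could look like this with Spring's MockMvc:

// Drives controller -> service -> repository -> embedded database in one go.
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration("classpath:test-context.xml")   // context backed by an embedded database
@WebAppConfiguration
public class EventControllerTest {

    @Autowired
    private WebApplicationContext context;

    private MockMvc mockMvc;

    @Before
    public void setUp() {
        mockMvc = MockMvcBuilders.webAppContextSetup(context).build();
    }

    @Test
    public void gettingAnExistingEventReturnsItsJson() throws Exception {
        mockMvc.perform(get("/events/1"))
               .andExpect(status().isOk())
               .andExpect(jsonPath("$.name").value("Poker Night"));   // value assumed to be seeded by the test data scripts
    }
}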

This approach will give you:

  • good coverage
  • realistic tests
  • minimum amount of effort
  • minimum amount of refactoring effort

However, testing layers in isolation does allow your team to work more concurrently, so one dev can do repository and another can do service for one piece of functionality, and produce independently tested work.

There will always be some double coverage once Selenium/functional tests are incorporated, since you can't rely on those alone; they are too slow to run. However, functional tests don't necessarily need to cover all the code: covering the core functionality alone can be sufficient, as long as the rest of the code is covered by unit/integration tests.

Okapi answered 18/9, 2012 at 1:9 Comment(0)

I think there are two advantages to having finer-grained tests (I intentionally avoid the term "unit test" here) in addition to the high-end integration tests.

1) Redundancy: having the layers covered in more than one place acts like a switch. If one set of tests (the integration tests, for example) fails to locate the error, the second layer may catch it. I will draw a comparison here with electric switches, where redundancy is a must: you have a main switch and a specialized switch.

2) Let's suppose you have a process calling an external service. For one reason or another (a bug), the original exception gets swallowed, and an exception that carries no information about the technical nature of the error reaches the integration test. The integration test will catch the error, but you will have no clue what the error is or where it is coming from. Having a finer-grained test in place increases the chance of pointing in the right direction as to what exactly failed and where.

I personally think that a certain level of redundancy in testing is not a bad thing.

In your particular case, if you write a CRUD test against an in-memory database, you get the chance to test your Hibernate mapping layer, which can be quite complex if you are using things like cascading or lazy fetching, and so on.
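For instance (an illustrative mapping, not code from the question), a cascading collection is exactly the kind of thing such a test catches and a mock-based test cannot:

// Illustrative entity: an Event that cascades persistence to its attendees.
@Entity
public class Event {
    @Id
    private Long id;

    private String name;

    @OneToMany(mappedBy = "event", cascade = CascadeType.ALL, orphanRemoval = true)
    private List<Attendee> attendees = new ArrayList<Attendee>();

    // getters and setters omitted
}

// An in-memory CRUD test exercises the mapping itself: persisting the parent
// must cascade to the children. (Attendee and the injected EntityManager are assumed.)
@Test
@Transactional
public void savingAnEventCascadesToItsAttendees() {
    Event event = new Event();
    event.setId(9L);
    event.setName("Company Softball Game");
    event.getAttendees().add(new Attendee(event, "Alice"));   // constructor sets the owning-side back-reference

    entityManager.persist(event);
    entityManager.flush();
    entityManager.clear();

    Event reloaded = entityManager.find(Event.class, 9L);
    assertThat(reloaded.getAttendees().size(), is(1));
}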

Rand answered 27/11, 2019 at 19:28 Comment(0)
