Estimating testing effort as a percentage of development time [closed]

Does anyone use a rule of thumb to estimate the effort required for testing as a percentage of the effort required for development? And if so, what percentage do you use?

Ontine answered 20/10, 2009 at 15:12 Comment(0)

In my experience, roughly 25% of effort goes to analysis; 50% to design, development, and unit testing; and the remaining 25% to testing. Most projects fit within a +/-10% variance of this rule of thumb, depending on the nature of the project, the knowledge of the resources, the quality of inputs and outputs, etc. Project management can be absorbed within these percentages or added as a 10-15% overhead on top.
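As a rough illustration, this split can be turned into a simple calculation. This is a minimal sketch: the 25/50/25 percentages and the 10-15% PM overhead come from the rule of thumb above, but the function and its names are my own invention, not a standard tool.

```python
# Effort split based on the 25/50/25 rule of thumb above.
# The function name and structure are illustrative only.

def effort_breakdown(total_days, pm_overhead=0.125):
    """Split total effort into phases; PM added as an overhead on top."""
    phases = {
        "analysis": 0.25,
        "design_dev_unit_test": 0.50,
        "testing": 0.25,
    }
    breakdown = {phase: total_days * share for phase, share in phases.items()}
    # Project management: 10-15% on top; 12.5% used here as a midpoint.
    breakdown["project_management"] = total_days * pm_overhead
    return breakdown

# Example: a 200-day project.
for phase, days in effort_breakdown(200).items():
    print(f"{phase}: {days:.0f} days")
```

For a 200-day project this gives 50 days of analysis, 100 days of design/development/unit test, 50 days of testing, and 25 days of project management on top.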

Fornax answered 20/3, 2013 at 23:55 Comment(0)

The Google Testing Blog discussed this problem recently:

So a naive answer is that writing tests carries a 10% tax. But we pay taxes in order to get something in return.

(snip)

These benefits translate to real value today as well as tomorrow. I write tests because the additional benefits I get more than offset the additional cost of 10%. Even if I don't include the long-term benefits, the value I get from tests today is well worth it. I am faster at developing code with tests. How much faster depends on the complexity of the code: the more complex the thing you are trying to build (more ifs/loops/dependencies), the greater the benefit of tests.

Beatty answered 20/10, 2009 at 15:17 Comment(0)

When you're estimating testing you need to identify the scope of your testing: are we talking unit, functional, UAT, interface, security, performance, stress, and volume testing?

If you're on a waterfall project you probably have some overhead tasks that are fairly constant. Allow time to prepare any planning documents, schedules and reports.

For a functional test phase (I'm a "system tester", so that's my main point of reference), don't forget to include planning! A test case often needs at least as much effort to extract from requirements/specs/user stories as it will take to execute. In addition, you need to include some time for defect raising and retesting. For a larger team you'll need to factor in test management: scheduling, reporting, meetings.

Generally my estimates are based on the complexity of the features being delivered rather than on a percentage of dev effort. However, this does require access to at least a high-level set of instructions. Years of testing have taught me that a test of a particular complexity will take x hours of effort to prepare and execute. Some tests may require extra effort for data setup. Some tests may involve negotiating with external systems and have a duration far in excess of the effort required.
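As an illustration of this complexity-based approach, here is a minimal sketch. The complexity bands, the hours per band, and the overhead factors are invented placeholders (they should come from your own historical data), not figures from this answer.

```python
# Illustrative complexity-based test estimation.
# HOURS_PER_TEST values are made-up placeholders; calibrate them
# against your own history of preparation + execution effort.

HOURS_PER_TEST = {
    "simple": 1.0,
    "medium": 2.5,
    "complex": 6.0,
}

def estimate_test_effort(test_counts, data_setup_hours=0.0,
                         defect_retest_factor=0.25, mgmt_factor=0.10):
    """Estimate total test hours from counts of tests per complexity band.

    defect_retest_factor: allowance for defect raising and retesting.
    mgmt_factor: test management (scheduling, reporting, meetings).
    """
    base = sum(HOURS_PER_TEST[band] * count
               for band, count in test_counts.items())
    base += data_setup_hours
    return base * (1 + defect_retest_factor + mgmt_factor)

# Example: 40 simple, 25 medium, and 10 complex tests, plus 16h of data setup.
counts = {"simple": 40, "medium": 25, "complex": 10}
print(f"{estimate_test_effort(counts, data_setup_hours=16):.0f} hours")
```

The point is less the numbers than the shape: effort is driven by test count and complexity, with planning, defect handling, and management layered on top, rather than being a flat percentage of dev effort.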

In the end, though, you need to review it in the context of the overall project. If your estimate is well above that for BA or Development then there may be something wrong with your underlying assumptions.

I know this is an old topic, but it's something I'm revisiting at the moment, and it's of perennial interest to project managers.

Osterman answered 14/3, 2013 at 23:36 Comment(0)

Some years ago, in a safety-critical field, I heard a figure of roughly one day of effort to unit test ten lines of code.

I have also observed effort split 50% for development and 50% for testing (not only unit testing).

Tifanie answered 20/10, 2009 at 15:18 Comment(2)
For safety-critical work it's also a ratio of 10 lines of test code to each line of code. – Borg
Does anyone have a reputable reference for this statistic? – Ative

Are you talking about automated unit/integration tests or manual tests?

For the former, my rule of thumb (based on measurements) is 40-50% added to development time, i.e. if developing a use case takes 10 days (before any QA and serious bugfixing happens), writing good tests takes another 4 to 5 days - though this should ideally happen before and during development, not afterwards.

Coolant answered 20/10, 2009 at 15:19 Comment(0)

When you speak of tests, you could mean waterfall or agile test development. In an agile environment, developers should spend 50% of their time developing and maintaining tests.

But that 50% extra will save you time when refactoring and manual verification come around.

Browning answered 20/10, 2009 at 15:28 Comment(1)
I'm realising that I haven't been very clear. Apologies. I am talking in the context of preparing a quotation for a client, using a methodology that is more waterfall than agile. Clients want to know what the budget is. We need to give them a realistic figure, but at the same time protect ourselves from the gazillion unknowns lying ahead so early in a project. We tend to agree a fixed quote for speccing and scoping the project, but only give an indication for the iterations/phases that follow. – Ontine

Testing time is probably more closely correlated to feature scope than to development time. I'd also argue (perhaps controversially) that testing time is correlated to the skill of your development team.

For a 6-to-9-month development effort, I demand an absolute minimum of 2 weeks of testing time, performed by actual testers (not the development team) who are well versed in the software they will be testing (i.e., the 2 weeks does not include ramp-up time). This is for a project that has ~5 developers.

Ganley answered 20/10, 2009 at 15:19 Comment(0)

Gartner stated in October 2006 that testing typically consumes between 10% and 35% of the work on a system integration project. I assume this applies to the waterfall method. It is quite a wide range, but much depends on the amount of customisation to a standard product and the number of systems being integrated.

Puissance answered 30/1, 2012 at 9:26 Comment(0)

The only time I factor in extra time for testing is if I'm unfamiliar with the testing technology I'll be using (e.g. using Selenium tests for the first time). Then I factor in maybe 10-20% for getting up to speed on the tools and getting the test infrastructure in place.

Otherwise testing is just an innate part of development and doesn't warrant an extra estimate. In fact, I'd probably increase the estimate for code done without tests.

EDIT: Note that I'm usually writing code test-first. If I have to come in after the fact and write tests for existing code, that's going to slow things down. I don't find that test-first development slows me down at all, except for very exploratory (read: throw-away) coding.

Bollay answered 20/10, 2009 at 15:21 Comment(0)

Judge by yesterday's weather. How long did it take last time? Are you trending longer or shorter? Each shop is different.

Most agile shops need a lot less time and have drastically fewer defects, and they resolve those defects more quickly, because of TDD. Even so, most agile shops spend some measurable time on testing/QC.

If this is the first test run for this application, then the answer is "let's see", followed by an attempt. It depends on:

- how quickly you can get questions answered,
- how testable the application is,
- how many features/functions there are,
- how many defects are discovered,
- how quickly issues are resolved,
- how many times the code cycles through testing, and
- how many times testing is blocked by bugs.

There is no way to tell. You could call it 50% or 175% or more, and not be wrong. Why not make a rough guess and multiply by pi? It won't be much worse than any other answer you can make up.

You should (indeed, must) know how long testing takes now, whether it's getting faster or slower, and whether coverage is increasing or decreasing. With those three bits of information, you should be able to guess quite well.
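A minimal sketch of that "yesterday's weather" idea follows. The simple linear trend adjustment is my own illustration of the approach, not a formula from this answer.

```python
# "Yesterday's weather" estimate: project the next test cycle from
# recent cycle durations plus the average cycle-to-cycle change.
# Illustrative sketch only; not an established estimation formula.

def next_cycle_estimate(cycle_days):
    """Average of past cycle durations, adjusted by the average trend."""
    avg = sum(cycle_days) / len(cycle_days)
    if len(cycle_days) < 2:
        return avg
    deltas = [b - a for a, b in zip(cycle_days, cycle_days[1:])]
    trend = sum(deltas) / len(deltas)  # positive means trending longer
    return avg + trend

# Example: the last four test cycles took 12, 11, 13, and 14 days.
print(f"{next_cycle_estimate([12, 11, 13, 14]):.1f} days")  # 13.2
```

Coverage can be tracked the same way: if cycles are getting longer while coverage is flat, the estimate should trend upwards, not just repeat last time's number.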

Neelon answered 20/10, 2009 at 23:8 Comment(0)
