What is the best way to reduce cyclomatic complexity when validating data? [closed]

Right now I'm working on a web application that receives a significant amount of data from a database that can return null results. When measuring the cyclomatic complexity of the application, a number of functions weigh in between 10 and 30. For the most part, the functions with the high numbers have a lot of lines similar to the following:

If Not oraData.IsDBNull(4) Then row("Field") = oraData.GetString(4)

Which leads me to my question: what is the best way to bring these numbers down? Right now I'm aiming to get the majority of the functions below 10.

Illomened answered 15/10, 2008 at 13:41 Comment(0)

What about using extension methods?

Imports System.Runtime.CompilerServices

Module Extensions

    <Extension()> _
    Public Function TryGetString(ByVal row As IDataRecord, ByVal i As Integer) As String
        If row.IsDBNull(i) Then
            Return Nothing
        End If
        Return row.GetString(i)
    End Function

End Module

Then you can simply write:

row("Field") = oraData.TryGetString(4)

This reads smoothly and reduces cyclomatic complexity on your functions.

Befuddle answered 15/10, 2008 at 14:12 Comment(0)

The first question is: why are you hung up on CC? It's a tool for evaluating how dense the code is, and the rule of thumb should simply be "not too high a CC number".

It's probably all those "If"s that are driving up the number, so reduce them: either call a wrapper function that extracts the data from the result set and handles the null, or change the query so it doesn't return nulls.
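The query-side option might look like the following sketch, assuming an Oracle database (the table and column names here are hypothetical, not from the original):

```sql
-- NVL substitutes a default when the column is NULL, so the client
-- never sees DBNull and the per-column IsDBNull checks disappear.
SELECT NVL(middle_name, '')        AS middle_name,
       NVL(phone,       'unknown') AS phone
FROM   customers;
```

The trade-off is that the "no value" information is erased before it reaches the application, which matters if, as noted below, the null itself is meaningful.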

Keep in mind that nulls do provide information and are not useless. For example: Republican or Democrat? A null indicates neither choice.

Fowler answered 15/10, 2008 at 13:49 Comment(3)
More than anything else, we are trying to make sure the application is maintainable in the long run. This is a high-profile application that will likely be around for a while, so making sure it is maintainable is important. Also, we are trying to focus the work only on the "critical" functions.Illomened
[Continued] So right now anything that is >= 20 is getting a careful look, but the problem is that a lot of them are data wrapper functions that also do data processing. So we are trying to figure out how to drive the numbers down, as they are also the most likely to be modified in the future.Illomened
Understood, and I've read the comments on the other answers. I think that the way it's presented (just a null check) would be a perfectly good exception in this case. High CC numbers on functions that are heavy on nested IFs are "bad" and should be refactored. I have a Case function that's the same way.Fowler

Decompose into functions, perhaps something like this:

//Object Pascal
procedure UpdateIfNotNull( const fldName: String; fldIndex : integer );
begin
  if not oraData.IsDBNull( fldIndex ) then
    row( fldName ) := oraData.GetString( fldIndex );
end;

Of course, you can extend the procedure's signature so that "oraData" and "row" can be passed as parameters.

Foreordination answered 15/10, 2008 at 13:53 Comment(0)

Did you see this question? It asks something similar (though, I think, at a more basic level), which means the answers there may not be much help here.

I would definitely agree with the other suggestions here: if there are repeated statements that can be neatly packaged into functions/procedures, that might be one approach to take, as long as you're not just shifting CC around. I'm not sure you've gained too much if you move from one proc with a CC of 35 to three procs with CCs of 15, 10, and 10. (It's not a bad first step, but ideally you'd be able to simplify something on a larger scope to reduce total CC in that area of your system.)

Incredulous answered 15/10, 2008 at 14:28 Comment(0)

It is possible to refactor the If into a separate utility function to reduce your CC. A number of functions, or a single function relying on type differentiation, might be required to handle the different database types (string, int, etc.).
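The type-differentiation idea could be sketched with one generic helper instead of a helper per type. This is a hypothetical example (the `TryGet` name and the `CType`-based conversion are assumptions, not from the original):

```vbnet
Imports System.Data
Imports System.Runtime.CompilerServices

Module DataRecordExtensions

    ' Generic null-safe getter: returns the column value converted to T,
    ' or T's default value (Nothing for reference types, zero for value
    ' types) when the column holds DBNull.
    <Extension()> _
    Public Function TryGet(Of T)(ByVal record As IDataRecord, ByVal i As Integer) As T
        If record.IsDBNull(i) Then
            Return Nothing
        End If
        Return CType(record.GetValue(i), T)
    End Function

End Module
```

Usage would then be `row("Age") = oraData.TryGet(Of Integer)(3)`, keeping one branch in the helper instead of one branch per column in the caller.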

However, I would argue that any such solution could result in less maintainable or readable code (i.e. you might worsen other metrics!), and, as QA, I would allow the high CC to pass given that justification.

Drews answered 15/10, 2008 at 13:56 Comment(1)
For the most part I'm against using a bunch of functions, for the same reasons you are. Depending upon the feedback I get, I might just flag the functions to be double-checked, to ensure that the complexity is due to the database, and then just make sure they are well tested.Illomened
