Should I suppress CA1062: Validate arguments of public methods?

I've recently upgraded my project to Visual Studio 2010 from Visual Studio 2008.

In Visual Studio 2008, this Code Analysis rule doesn't exist.

Now I'm not sure if I should use this rule or not.

I'm building an open source library, so it seems important to keep people from making mistakes. However, if all I'm going to do is throw an ArgumentNullException when the parameter is null, it feels like writing useless code, since an exception (a NullReferenceException, at least) will be thrown anyway even if I don't write that check.

EDIT: There is also a performance concern to address: checking for null in every public method adds overhead to every call.

Should I remove that rule or fix the violations?

Dissidence answered 18/5, 2010 at 11:17 Comment(0)

That depends. The convention when using ArgumentNullException is to include the name of the null argument in the exception message, so the caller knows exactly what went wrong.

The source of a NullReferenceException (which is what will happen if you don't validate) may be easy to spot, but if the method is complex it may be more difficult. You may end up at a line of code where multiple references could be null.

Personally I prefer to have public methods throw ArgumentNullException if they cannot handle null input for the given argument, because it allows for a consistent behavior no matter how the method is implemented.
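
To illustrate the difference (a minimal sketch with made-up type, method, and parameter names, not code from the question):

    using System;

    // Hypothetical library type; names are invented for illustration.
    public static class ReportBuilder
    {
        // Without a guard: a null 'entries' only fails when it is first
        // dereferenced, surfacing as a NullReferenceException that names
        // no argument and may appear deep inside the method.
        public static int TotalLengthUnchecked(string[] entries)
        {
            int total = 0;
            foreach (string entry in entries)   // NullReferenceException thrown here
                total += entry.Length;
            return total;
        }

        // With a guard: the caller immediately gets an ArgumentNullException
        // naming the argument, regardless of how the method is implemented.
        public static int TotalLength(string[] entries)
        {
            if (entries == null)
                throw new ArgumentNullException("entries"); // nameof(entries) in C# 6 and later

            int total = 0;
            foreach (string entry in entries)
                total += entry.Length;
            return total;
        }
    }

A caller passing null to TotalLength immediately gets an exception that names entries, rather than a NullReferenceException from somewhere inside the loop.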

In reply to the edit: in my opinion it is more important to provide a consistent set of interfaces with few surprises than to optimize every line of code. My guess is that in most cases the performance overhead of the null check will not be significant. If it does turn out to be a problem (that is, if profiling says so), I would consider changing it. Otherwise I wouldn't consider it a problem at this point.

Odrick answered 18/5, 2010 at 11:27 Comment(1)
Well said on the perf issues. The null check is cheap, and it usually won't find null. – Disaccustom

We keep this rule on because we decided it is always best to check parameters at their first entry into your code. That way, when the person using your code gets the exception, the context in it is better. Without the check, someone who does not have your source code will only see that an exception was thrown inside your code, and may file a bug report against you for it, wasting both your time and theirs.

Disaccustom answered 18/5, 2010 at 11:25 Comment(0)

IMHO, no. Checking for null is almost never going to be the performance bottleneck in your application. (And in the one-in-a-million case where it is significant, you'll find it with your profiler and remove that one check.)

The other question that should form in your mind is whether throwing an ArgumentNullException is really the best way to handle the error. Often you can handle things better than that (even if only by providing a better error report to the user and/or yourself for debugging purposes). In many cases code can handle a null gracefully, making it unnecessary for this to be an error at all.
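
As a hypothetical sketch of that last point (the type and method names are invented):

    using System;
    using System.Collections.Generic;

    // Hypothetical helper: a missing tag list is treated as "no tags"
    // rather than as an error, so a caller passing null simply gets
    // the empty result.
    public static class TagFormatter
    {
        public static string Format(IEnumerable<string> tags)
        {
            if (tags == null)
                return string.Empty;

            return string.Join(", ", tags);
        }
    }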

edit

To answer your edit: Null checks really don't take long. The overhead for simply calling a method will be tens if not hundreds of times more than a null check. The only place where a null check will make any significant difference is in a large, tight loop where you are doing very little else. This situation doesn't happen very often - usually you will check for a null and then do something relatively expensive with that reference.
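
If you want to convince yourself, here is a crude throwaway sketch (made-up names, not a proper benchmark harness) that times the same work with and without the guard:

    using System;
    using System.Diagnostics;

    static class NullCheckCost
    {
        static int WithoutCheck(string s)
        {
            return s.Length;
        }

        static int WithCheck(string s)
        {
            if (s == null)
                throw new ArgumentNullException("s");
            return s.Length;
        }

        static void Main()
        {
            const int iterations = 100000000;
            string value = "hello";
            long total = 0;

            // Time the loop without the null check.
            var sw = Stopwatch.StartNew();
            for (int i = 0; i < iterations; i++)
                total += WithoutCheck(value);
            sw.Stop();
            Console.WriteLine("no check:   {0} ms ({1})", sw.ElapsedMilliseconds, total);

            // Time the same loop with the null check.
            total = 0;
            sw.Restart();
            for (int i = 0; i < iterations; i++)
                total += WithCheck(value);
            sw.Stop();
            Console.WriteLine("with check: {0} ms ({1})", sw.ElapsedMilliseconds, total);
        }
    }

You would expect the two timings to be indistinguishable within noise: the branch is dwarfed by the surrounding call and addition, let alone by whatever the real method goes on to do with the reference.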

There is no situation where a crash or failure is a good thing. It is always better to "slow your application down" with null checks than to crash and lose your customer's data.

So don't prematurely optimise your code. Write it well, to be maintainable and robust, then profile it to see where the bottlenecks are. I've been programming for 28 years, using null checks very liberally, and have never found a null check to be the cause of a performance problem. The usual culprits are things like doing lots of unnecessary work in a loop, using an O(n^3) algorithm where an O(n^2) approach is possible, or failing to cache expensive-to-compute values.

Span answered 18/5, 2010 at 19:19 Comment(0)
