I want to check if a reference type is null. I see two options (_settings is of reference type FooType):
if (_settings == default(FooType)) { ... }
and
if (_settings == null) { ... }
How do these two perform differently?
There's no difference. The default value of any reference type is null. See MSDN's C# reference page for the default keyword: https://msdn.microsoft.com/en-us/library/25tdedf5.aspx.
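A minimal sketch of the equivalence (the FooType name is taken from the question; the rest is illustrative):

```csharp
// Hypothetical reference type from the question.
class FooType { }

class Program
{
    static void Main()
    {
        FooType _settings = null;

        // default(FooType) is null for any reference type, so these
        // two conditions always evaluate the same way.
        bool viaDefault = _settings == default(FooType);
        bool viaNull    = _settings == null;

        System.Console.WriteLine(viaDefault == viaNull); // prints True
    }
}
```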
null values have a type. Is that really true? – Cleaning

Now that we don't need to pass the type to default anymore, default is preferred:
- It is just as readable
- It can be used both for value and reference types
- It can be used in generics

if (_settings == default) { ... }
Also, after calling

obj = enumerable.FirstOrDefault();

it makes more sense to test for default afterwards, not for null. Otherwise the method would have been called FirstOrNull: value types don't have a null value, but they do have a default.
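A sketch of the target-typed default literal (C# 7.1+) after FirstOrDefault, using an assumed int sequence:

```csharp
using System.Linq;

class Program
{
    static void Main()
    {
        var numbers = new[] { 1, 2, 3 };

        // FirstOrDefault returns default(int), i.e. 0, when nothing matches.
        // Comparing against the target-typed `default` literal reads the
        // same whether the element type is a value type or a reference type.
        int match = numbers.FirstOrDefault(n => n > 10);
        if (match == default) // equivalent to match == 0 here
        {
            System.Console.WriteLine("no match"); // prints no match
        }
    }
}
```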
There is no difference, but the second one is more readable. The best place to use default is when you deal with generics, where return default(T); is common code.
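The return default(T); pattern in a generic method might look like this (the dictionary helper is illustrative, not from the thread):

```csharp
using System.Collections.Generic;

static class Lookup
{
    // Returns the stored value, or default(T) when the key is absent:
    // null for reference types, a zeroed value (0, false, ...) for value types.
    public static T GetOrDefault<T>(IDictionary<string, T> map, string key)
    {
        if (map.TryGetValue(key, out var value))
            return value;
        return default(T);
    }
}
```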
For DateTime, default returns "0001-01-01T00:00:00", which is a valid value... unlike null, which is an invalid value. – Kob

DateTime is not a reference type. – Lamson

I use default (without parentheses) all over the place because I instinctively know what the default is for the types that I work with. It can mean null, DateTime.MinValue, or 0 for my int. It would be more readable for people that don't automatically register the default value in their head. It can also, for example, help you search for all values that are assigned a default value: instead of searching for null, 0, or Decimal.Zero, you just search for default, == default, or even is default. – Eberto

They're not different, but I think

if (_settings == null) { ... }

is clearer.
My understanding is they are not different. It only matters when you are dealing with value types.
I would definitely go with the explicit check against null, because if the type of _settings ever changes you may run into reference issues. At minimum it would require a change to the code, breaking the open/closed principle.
if( _settings == null ) {...}
This IMO is safer and cleaner.
As has been mentioned, there is no difference... but you might want to use default(<type>)
anyway, to handle the cases where it's not a reference type. Typically this is only in generics, but it's a good habit to form for the general case.
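Note that == default(T) doesn't compile for an unconstrained type parameter (no == operator is defined for it); a common workaround, sketched here with an assumed helper name, is EqualityComparer&lt;T&gt;.Default:

```csharp
using System.Collections.Generic;

static class Check
{
    // True when `value` is the default for T: null for reference types,
    // a zeroed value for value types.
    public static bool IsDefault<T>(T value)
    {
        return EqualityComparer<T>.Default.Equals(value, default(T));
    }
}

// Check.IsDefault<string>(null)  -> true
// Check.IsDefault(0)             -> true
// Check.IsDefault(42)            -> false
```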
default(T), it's more readable. – Hyperaesthesia