This is because .NET defaults to 'ToEven' rounding, while SQL Server uses 'AwayFromZero'. These are different midpoint rounding methods; they differ in how they treat values that end in 5 and thus sit exactly halfway between two candidates.
AwayFromZero rounds such a midpoint away from zero: up for positive numbers, down for negative ones. So 0.5 becomes 1, and -0.5 becomes -1. ToEven rounds the midpoint to the nearest even number: 2.5 becomes 2, 3.5 becomes 4 (and likewise for negative numbers). All other values are treated the same by both methods and are simply rounded to the nearest number. Only the midpoint is equidistant from two candidates, so it is the special case that needs a tie-breaking strategy.
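For example, a small self-contained C# sketch (using only the standard Math.Round overload that takes a MidpointRounding mode) that prints both strategies side by side:

```csharp
using System;

class MidpointDemo
{
    static void Main()
    {
        double[] values = { 0.5, 1.5, 2.5, 3.5, -2.5 };
        foreach (double v in values)
        {
            // Same input, two tie-breaking strategies
            double even = Math.Round(v, MidpointRounding.ToEven);
            double away = Math.Round(v, MidpointRounding.AwayFromZero);
            Console.WriteLine($"{v,5}: ToEven = {even,3}, AwayFromZero = {away,3}");
        }
    }
}
```

Note that 2.5 comes out as 2 under ToEven but 3 under AwayFromZero, while 3.5 is 4 under both, because 4 is already the even neighbour.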
ToEven is also known as 'banker's rounding', and it's the default rounding mode in IEEE 754, which is why it's the default in .NET.
Conversely, AwayFromZero is also known as 'commercial rounding'. I don't know why it is the default in SQL Server; probably simply because it's the most widely known and understood method.
Of course, you can always configure what you need:
In C# you can do:
Math.Round(value, MidpointRounding.ToEven)
or
Math.Round(value, MidpointRounding.AwayFromZero)
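There is also an overload that takes the number of decimal places. A small sketch (0.125 is chosen deliberately: it is exactly representable in binary, so it is a true midpoint rather than a floating-point approximation of one):

```csharp
Console.WriteLine(Math.Round(0.125, 2, MidpointRounding.ToEven));       // 0.12
Console.WriteLine(Math.Round(0.125, 2, MidpointRounding.AwayFromZero)); // 0.13
```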
In SQL Server you can use ROUND(), and FLOOR() or CEILING() when you need to force the direction. For example:
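A sketch in T-SQL, assuming SQL Server and decimal literals (results as comments):

```sql
-- ROUND() on decimal values rounds the midpoint away from zero
SELECT ROUND(2.5, 0);   -- 3.0
SELECT ROUND(-2.5, 0);  -- -3.0

-- FLOOR() and CEILING() always round in a fixed direction instead
SELECT FLOOR(2.5);      -- 2
SELECT CEILING(2.5);    -- 3
```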
Which method is better depends on what you use it for and what you want. For reasonable collections/distributions, the average of the ToEven-rounded values stays (approximately) equal to the average of the original values. This is not necessarily the case with AwayFromZero: if your collection contains many values ending in .5, rounding AwayFromZero pushes all of them in the same direction and introduces a bias.

The effect is that the average of the rounded values no longer matches the average of the original values. The point of rounding is to make a value simpler while it keeps the same meaning, and that is no longer true once the averages diverge: the rounded values then carry a (slightly?) different meaning than the original values.
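To make the bias concrete, here's a sketch with a deliberately extreme collection in which every value ends in .5, so the tie-breaking strategy decides everything:

```csharp
using System;
using System.Linq;

class BiasDemo
{
    static void Main()
    {
        // Every value is a midpoint
        double[] data = { 0.5, 1.5, 2.5, 3.5, 4.5, 5.5, 6.5, 7.5 };

        double original = data.Average();
        double toEven = data.Select(v => Math.Round(v, MidpointRounding.ToEven)).Average();
        double away = data.Select(v => Math.Round(v, MidpointRounding.AwayFromZero)).Average();

        // ToEven alternates between rounding down and up (0, 2, 2, 4, 4, 6, 6, 8 -> average 4.0);
        // AwayFromZero rounds every value up here (1, 2, 3, 4, 5, 6, 7, 8 -> average 4.5)
        Console.WriteLine($"original = {original}, ToEven = {toEven}, AwayFromZero = {away}");
    }
}
```

Here ToEven preserves the average exactly (4.0), while AwayFromZero shifts it up to 4.5.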