The other answers are correct that you can average `TimeSpan` values with simple LINQ. I prefer letting the `TimeSpan` class do its work to determine precision, as Uffe showed. But make sure you are using the correct property: `.Seconds` returns only the remainder of seconds that don't fit into a full minute. You want the `.TotalSeconds` property.
Using `.Ticks` might be tempting, but you could easily get an `OverflowException` if you have a lot of values and they are spaced far enough apart. My recommendation is to use one unit smaller than you need in your result. So if you care about average precision to the minute, then `.TotalSeconds` should work well. You can then convert the result back into a `TimeSpan` if you like.
var sec = yourList.Select(p => p.Stop - p.Start).Average(p => p.TotalSeconds);
var avg = TimeSpan.FromSeconds(sec);
Also, since your `Start` and `Stop` values are of type `DateTime`, you really need to pay attention to their `.Kind`. An operation like this is only guaranteed to be accurate if `yourDateTime.Kind == DateTimeKind.Utc`.
If they are `Local` kinds, then the result will be influenced by the time zone of the computer running the code. If they are `Unspecified`, then the data may have been recorded in the context of the local time zone, or of some other time zone. If the times you are working with span a daylight saving time transition, the result of the subtraction may be incorrect.
For example, if you are running this code in the US Pacific Time zone, and you have `Local` kinds of `DateTime`, then subtracting `2013-11-03 06:00` - `2013-11-03 00:00` would give you a `TimeSpan` of 6 hours. But in reality, 7 hours will have elapsed, since that is the day DST ends and the clocks repeat the hour between 1:00 and 2:00.
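To illustrate, here is a small sketch using `DateTimeOffset`, which carries each value's UTC offset (the `-7`/`-8` offsets below are the Pacific Daylight/Standard offsets in effect at those moments):

```csharp
// Pacific Time, 2013-11-03: DST ends at 2:00 and clocks fall back to 1:00.
var start = new DateTimeOffset(2013, 11, 3, 0, 0, 0, TimeSpan.FromHours(-7)); // PDT
var stop  = new DateTimeOffset(2013, 11, 3, 6, 0, 0, TimeSpan.FromHours(-8)); // PST

Console.WriteLine(stop - start);                    // 07:00:00 - true elapsed time
Console.WriteLine(stop.DateTime - start.DateTime);  // 06:00:00 - naive local-clock math
```

Because the subtraction happens on the offset-adjusted (UTC) instants, the repeated hour is counted correctly.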
To avoid this problem, you should only do math with `DateTime` values that are in `Utc`, or you should use `DateTimeOffset` values instead.
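Putting it together, here is a self-contained sketch of the averaging with `DateTimeOffset` (the `Period` record and sample values are hypothetical stand-ins for your own type):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public record Period(DateTimeOffset Start, DateTimeOffset Stop);

public static class Program
{
    public static void Main()
    {
        var periods = new List<Period>
        {
            // Spans the US Pacific DST transition on 2013-11-03: 7 hours elapsed.
            new(new DateTimeOffset(2013, 11, 3, 0, 0, 0, TimeSpan.FromHours(-7)),
                new DateTimeOffset(2013, 11, 3, 6, 0, 0, TimeSpan.FromHours(-8))),
            // An ordinary 1-hour period the next day.
            new(new DateTimeOffset(2013, 11, 4, 9, 0, 0, TimeSpan.FromHours(-8)),
                new DateTimeOffset(2013, 11, 4, 10, 0, 0, TimeSpan.FromHours(-8))),
        };

        // Same pattern as before - average in seconds, then convert back -
        // but the offsets make each subtraction unambiguous.
        var sec = periods.Select(p => p.Stop - p.Start).Average(ts => ts.TotalSeconds);
        var avg = TimeSpan.FromSeconds(sec);

        Console.WriteLine(avg);  // 04:00:00 (average of 7h and 1h)
    }
}
```

The subtraction of two `DateTimeOffset` values still yields a `TimeSpan`, so the rest of the code is unchanged.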