I just ran into some unexpected behavior with DateTime.UtcNow while writing some unit tests. When you call DateTime.Now/UtcNow in rapid succession, it appears to return the same value over a longer-than-expected interval of time, rather than advancing in finer millisecond increments.
I know there is a Stopwatch class that is better suited for precise time measurements, but I was curious whether someone could explain this behavior in DateTime. Is there an official precision documented for DateTime.Now (for example, precise to within 50 ms)? And why would DateTime.Now be made less precise than what most CPU clocks could handle? Maybe it's just designed for the lowest-common-denominator CPU?
using System;
using System.Diagnostics;

public static void Main(string[] args)
{
    var stopwatch = new Stopwatch();
    stopwatch.Start();
    for (int i = 0; i < 1000; i++)
    {
        var now = DateTime.Now;
        Console.WriteLine(string.Format(
            "Ticks: {0}\tMilliseconds: {1}", now.Ticks, now.Millisecond));
    }
    stopwatch.Stop();
    Console.WriteLine("Stopwatch.ElapsedMilliseconds: {0}",
        stopwatch.ElapsedMilliseconds);
    Console.ReadLine();
}
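
Independently of the repro above, a rough sketch along the same lines (the class name ClockResolutionProbe and the 500 ms window are just placeholders I picked, not part of the original test) estimates the observed update interval of DateTime.UtcNow by polling it in a tight loop and counting how often the returned value actually changes, with no console output inside the timed region:

using System;
using System.Diagnostics;

public static class ClockResolutionProbe
{
    public static void Main()
    {
        long previous = DateTime.UtcNow.Ticks;
        int changes = 0;
        long reads = 0;

        // Poll the clock as fast as possible for roughly 500 ms and count
        // how often the returned value actually changes.
        var stopwatch = Stopwatch.StartNew();
        while (stopwatch.ElapsedMilliseconds < 500)
        {
            long ticks = DateTime.UtcNow.Ticks;
            reads++;
            if (ticks != previous)
            {
                changes++;
                previous = ticks;
            }
        }
        stopwatch.Stop();

        // If the clock only updates every ~15 ms, 'changes' stays small even
        // though 'reads' is in the millions; elapsed / changes estimates the quantum.
        Console.WriteLine("Reads: {0}, value changes: {1}, approx. update interval: {2:F1} ms",
            reads,
            changes,
            changes > 0 ? (double)stopwatch.ElapsedMilliseconds / changes : 0.0);
    }
}

As far as I understand, on Windows the value typically advances in steps on the order of 10–16 ms (the system timer interval), which would match what a loop like this reports.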
If you change the Console.WriteLine inside the loop to a single call like
Console.WriteLine(String.Format("Ticks: {0} Milliseconds: {1} Stopwatch: {2}", now.Ticks, now.Millisecond, stopwatch.ElapsedMilliseconds));
then the stopwatch value changes with the exact same interval as the DateTime value. Maybe it's the Console.WriteLine that's doing a not-so-good job rather than DateTime. – Valise
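
Following up on Valise's point, here is an assumed variant (my own sketch, not the commenter's exact code; the class name BufferedSamplingDemo is made up) that samples DateTime.Now.Ticks into an array inside the loop and only writes to the console after the loop finishes, so Console.WriteLine cannot distort either measurement:

using System;
using System.Diagnostics;

public static class BufferedSamplingDemo
{
    public static void Main()
    {
        const int count = 1000;
        var ticks = new long[count];

        var stopwatch = Stopwatch.StartNew();
        for (int i = 0; i < count; i++)
        {
            ticks[i] = DateTime.Now.Ticks;   // sample only; no I/O inside the timed loop
        }
        stopwatch.Stop();

        // Print the samples only after the measurement is complete.
        for (int i = 0; i < count; i++)
        {
            Console.WriteLine("Ticks: {0}", ticks[i]);
        }
        Console.WriteLine("Stopwatch.ElapsedMilliseconds: {0}", stopwatch.ElapsedMilliseconds);
    }
}

With the I/O moved out of the loop, any clustering of identical tick values that remains should be attributable to the clock itself rather than to Console.WriteLine.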