Windows 7 timing functions - How to use GetSystemTimeAdjustment correctly?

3

11

I ran some tests using the GetSystemTimeAdjustment function on Windows 7, and got some interesting results which I cannot explain. As far as I understand, this function should report whether the system time is adjusted periodically and, if it is, at which interval and with which increment it is updated (see GetSystemTimeAdjustment function on MSDN).

From this I conclude that if I query the system time repeatedly, for example using GetSystemTimeAsFileTime, I should either see no change (the system clock has not been updated) or a change which is a multiple of the increment retrieved by GetSystemTimeAdjustment. Question one: Is this assumption correct?

Now consider the following testing code:

#include <windows.h>
#include <iostream>
#include <iomanip>

int main()
{
    FILETIME fileStart;
    GetSystemTimeAsFileTime(&fileStart);
    ULARGE_INTEGER start;
    start.HighPart = fileStart.dwHighDateTime;
    start.LowPart = fileStart.dwLowDateTime;

    for (int i=20; i>0; --i)
    {
        FILETIME timeStamp1;
        ULARGE_INTEGER ts1;

        GetSystemTimeAsFileTime(&timeStamp1);

        ts1.HighPart = timeStamp1.dwHighDateTime;
        ts1.LowPart  = timeStamp1.dwLowDateTime;

        std::cout << "Timestamp: " << std::setprecision(20) << (double)(ts1.QuadPart - start.QuadPart) / 10000000 << std::endl;

    }

    DWORD dwTimeAdjustment = 0, dwTimeIncrement = 0;
    BOOL fAdjustmentDisabled = TRUE;
    GetSystemTimeAdjustment(&dwTimeAdjustment, &dwTimeIncrement, &fAdjustmentDisabled);

    std::cout << "\nTime Adjustment disabled: " << fAdjustmentDisabled
        << "\nTime Adjustment: " << (double)dwTimeAdjustment/10000000
        << "\nTime Increment: " << (double)dwTimeIncrement/10000000 << std::endl;

}

It takes 20 timestamps in a loop and prints them to the console. In the end it prints the increment with which the system clock is updated. I would expect the differences between the timestamps printed in the loop to be either 0 or multiples of this increment. However, I get results like this:

Timestamp: 0
Timestamp: 0.0025000000000000001
Timestamp: 0.0074999999999999997
Timestamp: 0.01
Timestamp: 0.012500000000000001
Timestamp: 0.014999999999999999
Timestamp: 0.017500000000000002
Timestamp: 0.022499999999999999
Timestamp: 0.025000000000000001
Timestamp: 0.0275
Timestamp: 0.029999999999999999
Timestamp: 0.032500000000000001
Timestamp: 0.035000000000000003
Timestamp: 0.040000000000000001
Timestamp: 0.042500000000000003
Timestamp: 0.044999999999999998
Timestamp: 0.050000000000000003
Timestamp: 0.052499999999999998
Timestamp: 0.055
Timestamp: 0.057500000000000002

Time Adjustment disabled: 0
Time Adjustment: 0.0156001
Time Increment: 0.0156001

So it appears that the system time is updated at an interval of about 0.0025 seconds and not the 0.0156 seconds returned by GetSystemTimeAdjustment.

Question two: What is the reason for this?

Acaroid answered 7/10, 2011 at 10:5 Comment(0)
30

The GetSystemTimeAsFileTime API provides access to the system's wall clock in file time format.

A 64-bit FILETIME structure receives the system time as a count of 100 ns units that have elapsed since January 1, 1601. The call to GetSystemTimeAsFileTime typically takes 10 ns to 15 ns.

In order to investigate the real accuracy of the system time provided by this API, the granularity that comes along with the time values needs to be discussed. In other words: How often is the system time updated? A first estimate is provided by the hidden API call:

NTSTATUS NtQueryTimerResolution(OUT PULONG MinimumResolution, 
                                OUT PULONG MaximumResolution, 
                                OUT PULONG ActualResolution);

NtQueryTimerResolution is exported by the native Windows NT library NTDLL.DLL. The ActualResolution reported by this call represents the update period of the system time in 100 ns units, which does not necessarily match the interrupt period. The value depends on the hardware platform. Common hardware platforms report 156,250 or 100,144 for ActualResolution; older platforms may report even larger numbers; newer systems, particularly when HPET (High Precision Event Timer) or constant/invariant TSC is supported, may return 156,001 for ActualResolution.

This is one of the heartbeats controlling the system. The MinimumResolution and the ActualResolution are relevant for the multimedia timer configuration.

The ActualResolution can be set by using the API call

NTSTATUS NtSetTimerResolution(IN ULONG RequestedResolution,
                              IN BOOLEAN Set,
                              OUT PULONG ActualResolution);

or via the multimedia timer interface

MMRESULT timeBeginPeriod(UINT uPeriod);

with the value of uPeriod derived from the range allowed by

MMRESULT timeGetDevCaps(LPTIMECAPS ptc, UINT cbtc );

which fills the structure

typedef struct {
  UINT wPeriodMin;
  UINT wPeriodMax;
} TIMECAPS;

Typical values are 1 ms for wPeriodMin and 1,000,000 ms for wPeriodMax.

There is an unfortunate misinterpretation when looking at the min/max values here:

  • wPeriodMin defines the minimum period, which is clear in this context.
  • MinimumResolution returned by NtQueryTimerResolution on the other hand specifies a resolution. The lowest obtainable resolution (MinimumResolution) is in the range of up to about 20 ms, while the highest obtainable resolution (MaximumResolution) can be 0.5 ms. However, the 0.5 ms resolution is not accessible through a call to timeBeginPeriod.

The multimedia timer interface handles periods and NtQueryTimerResolution() handles resolutions (reciprocal value of period).

Summary: GetSystemTimeAdjustment is not the function to look at. It only tells you whether and how time adjustments are applied. Depending on the setting of the multimedia timer interface timeBeginPeriod, time may progress more often and in smaller portions. Use NtQueryTimerResolution to obtain the actual time increment. And be aware that settings made through the multimedia timer API do influence these values. (Example: while the media player is showing a video, the update periods become short.)

I have diagnosed Windows time matters to a large extent. Some of the results can be found here.

Note: A Time Adjustment of 0.0156001 clearly identifies Windows Vista or higher with HPET and/or constant/invariant TSC on your system.

Implementation: If you want to catch the time transition:

#include <windows.h>
#include <stdio.h>

int main() {
  FILETIME FileTime, LastFileTime;
  long long DueTime, LastTime;
  long FileTimeTransitionPeriod;

  GetSystemTimeAsFileTime(&FileTime);
  for (int i = 0; i < 20; i++) {
    LastFileTime = FileTime;   // copy the whole structure, not just the low part
    // enough to just look at the low part to catch the transition
    while (FileTime.dwLowDateTime == LastFileTime.dwLowDateTime)
      GetSystemTimeAsFileTime(&FileTime);
    CopyMemory(&DueTime, &FileTime, sizeof(FILETIME));
    CopyMemory(&LastTime, &LastFileTime, sizeof(FILETIME));
    FileTimeTransitionPeriod = (long)(DueTime - LastTime);
    fprintf(stdout, "transition period: %7.4lf ms\n",
            (double)FileTimeTransitionPeriod / 10000);
  }
  return 0;
}

// WARNING: This code consumes 100% of the cpu for 20 file time increments.
// At the standard file time increment of 15.625 ms this corresponds to 312.5ms!

But: When the filetime transition is very short (e.g. set by timeBeginPeriod(wPeriodMin)), any output like fprintf or std::cout may distort the result because it delays the loop. In such cases I'd recommend storing the 20 results in a data structure and doing the output afterwards.

And: The filetime transition may not always be the same. It may well be that the file time increment does not match the update period. See the link above for more details and examples of this behavior.

Edit: Use caution when calling timeBeginPeriod, as frequent calls can significantly affect the system clock (see MSDN). This behavior applies up to Windows 7.

Calls to timeBeginPeriod/timeEndPeriod or NtSetTimerResolution may change the system time by as much as ActualResolution. Doing this very often results in considerable changes of the system time. However, when the calls are made at or near a transition of the system time, the deviations are much smaller. Polling for a system time transition/increment ahead of calls to the above functions is advised for demanding applications like NTP clients. Synchronizing to an NTP server is difficult when unwanted jumps in the system time progress occur.

Drafty answered 31/7, 2012 at 15:27 Comment(2)
here is a little tool and C# code to use the described method: github.com/tebjan/TimerTool Monohydroxy
Second parameter for NtQueryTimerResolution must be OUT PULONG MaximumResolution, not OUT LONGMaximumResolution. Also add spaces between parameter types (PULONG) and parameter names.Blessing
2

You are actually profiling how long one pass through the for() loop takes. I get some more variability but 5 milliseconds is about right, console output is not very fast. Arbitrarily add some more std::cout statements to slow it down.

Necrotomy answered 7/10, 2011 at 12:36 Comment(7)
No, GetSystemTimeAdjustment does not return the clock tick interrupt period.Ehrsam
NtQueryTimerResolution (NTDLL.DLL) will return the clock tick. GetSystemTimeAdjustment will only let you know whether a system time adjustment is active and what the update increments are.Drafty
Well, "Why is fprintf so slow" may get you out of the blocking output. I'd suggest less output to really poll for the file time transition.Drafty
GetSystemTimeAdjustment() returns the clock interrupt period through the 2nd argument. 156001 on my machine, as expected and documented. This appears to be controversial so I just deleted it.Necrotomy
@HansPassant: it's not surprising that the time increment interval is set to the same value as the clock interrupt period by default (meaning that an adjustment will be made exactly once per clock tick) but according to the documentation they're not synonymous.Ehrsam
@HarryJohnston: The time increment interval does not necessarily match the clock interrupt period. This particularly matters when the interrupt period is made short by timeBeginPeriod. Windows applies a correction beat frequency to compensate for this. See here for more details. At Hans: You're running Windows 7 (the ending 1 in 156001 clearly identifies Win7); on your system the time increment matches the interrupt period. But Harry is right: they are not synonymous and they should not be treated as equal.Drafty
I've done some experimentation and it looks as if I was wrong. As Arno points out, the underlying hardware interrupt period might be different from the system time resolution, but it seems that GetSystemTimeAdjustment does indeed return the system time resolution.Ehrsam
2

GetSystemTimeAsFileTime's resolution is system-dependent. I've seen it claimed that it's between 10 ms and 55 ms. Commentators on the MSDN document put it at 15 ms and "sub millisecond". What it actually is seems unclear, but I've never seen its resolution claimed to be equal to the 100 ns precision of the timestamp.

This means there's always going to be some variance, and it's also the reason people use QueryPerformanceCounter (together with QueryPerformanceFrequency) instead.

Midgett answered 7/10, 2011 at 18:38 Comment(3)
I can't find those comments in the linked MSDN document. However, the granularity of GetSystemTimeAsFileTime() is given by the TimeIncrement returned by GetSystemTimeAdjustment(). This should not be mixed up with the capability of the system file time to represent 100ns units.Drafty
@Drafty I've updated the link to the older docs. Thanks for that. As an aside, why do you believe there's a relationship between GetSystemTimeAsFileTime and GetSystemTimeAdjustment?Midgett
System filetime represents 100 ns units. But it does not increment in 100 ns steps. The filetime update quantum and rate are determined by a complex mechanism, which takes care of the whole variety of hardware. This includes the interrupt period and possible beat frequencies, which do occur if the interrupt period does not match the filetime quantum (TimeIncrement). Further reading: My answer below and this.Drafty
