glutMainLoop() vs glutTimerFunc()?

I know that glutMainLoop() is used to call display over and over again, maintaining a constant frame rate. At the same time, I can also have glutTimerFunc(), which calls glutPostRedisplay() at the end, so it can maintain a different frame rate.

When they are working together, what really happens? Does the timer function add to the frame rate of the main loop and make it faster? Or does it change the default refresh rate of the main loop? How do they work in conjunction?

Lyricist answered 23/3, 2018 at 9:49 Comment(0)

I know that glutMainLoop() is used to call display over and over again, maintaining a constant frame rate.

Nope! That's not what glutMainLoop does. The purpose of glutMainLoop is to poll operating system events, check if timers have elapsed, see if windows have to be redrawn, and then call the respective callback functions registered by the user. This happens in a loop, and usually this loop is started from the main entry point of the program, hence the name "main loop".
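
For illustration, here is a minimal sketch of a GLUT program (the window title and everything besides the GLUT calls is arbitrary): all it does is register callbacks and then hand control to glutMainLoop.

#include <GL/glut.h>

/* Called by glutMainLoop whenever the window has to be redrawn,
   e.g. after an expose event or after a glutPostRedisplay(). */
static void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT);
    /* ... draw the scene ... */
    glutSwapBuffers();
}

int main(int argc, char *argv[])
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB);
    glutCreateWindow("demo");

    glutDisplayFunc(display);   /* register the callbacks ... */

    glutMainLoop();             /* ... then let GLUT dispatch them; never returns */
    return 0;
}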

When they are working together, what really happens? Does the timer function add to the frame rate of the main loop and make it faster? Or does it change the default refresh rate of the main loop? How do they work in conjunction?

As already mentioned, dispatching timers is part of the responsibility of glutMainLoop, so you can't have GLUT timers without it. More importantly, if no events happened, no redisplay was posted and no idle function is registered, glutMainLoop will "block" the program until something interesting happens (i.e. no CPU cycles are being consumed).

Essentially it goes like this:

/* pseudocode */
void glutMainLoop(void)
{
    for(;;){
        /* ... poll and dispatch operating system events ... */
        foreach(t in timers){
            if( t.elapsed() ){
                t.callback(…);   /* dispatch this timer's callback */
                continue;        /* go on checking the remaining timers */
            }
        }
        /* ... */
        if( display.posted ){       /* a redisplay was requested */
            display.callback();     /* i.e. the registered display function */
            display.posted = false;
            continue;               /* start the next loop iteration */
        }
        idle.callback();            /* nothing else to do */
    }
}

At the same time, I can also have glutTimerFunc(), which calls glutPostRedisplay() at the end, so it can maintain a different frame rate.

The timers provided by GLUT make no guarantees about their precision or jitter, hence they're not particularly well suited for frame rate limiting.
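
For reference, the pattern described in the question typically looks like the following sketch (the 16 ms interval, aiming at roughly 60 Hz, is just an example). Note that GLUT timers fire only once, so the callback has to re-register itself:

#include <GL/glut.h>

/* Posts a redisplay and re-arms itself; glutMainLoop calls the display
   callback on one of its next iterations because a redisplay was posted. */
static void timer(int value)
{
    glutPostRedisplay();
    glutTimerFunc(16, timer, value);   /* "at least" 16 ms, no precision guarantee */
}

/* during setup, before glutMainLoop(): glutTimerFunc(16, timer, 0); */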

Normally the frame rate is limited by v-sync (or it should be), but blocking on v-sync means you cannot use that time to do something useful, because the process is blocked. A better approach is to register an idle function in which you poll a high resolution timer (on POSIX compliant systems clock_gettime(CLOCK_MONOTONIC, …), on Windows QueryPerformanceCounter) and issue a glutPostRedisplay once one display refresh interval, minus the time required for rendering the frame, has elapsed.

Of course it's hard to predict exactly how long rendering is going to take, so the usual approach is to collect a sliding window average and deviation and adjust with that. You also want to align that timer with v-sync.
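
A minimal sketch of that idea, assuming a 60 Hz display; the constants and helper names are mine, and a simple exponential moving average stands in for the sliding window:

#include <GL/glut.h>
#include <time.h>

#define REFRESH_INTERVAL (1.0 / 60.0)    /* assumed display refresh period */

static double now_seconds(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts); /* POSIX high resolution clock */
    return ts.tv_sec + ts.tv_nsec * 1e-9;
}

static double last_post  = 0.0;          /* when the last redisplay was posted */
static double avg_render = 0.0;          /* filtered estimate of the render time */

static void idle(void)
{
    double t = now_seconds();
    /* post the next frame one refresh interval after the previous one,
       minus the time rendering is expected to take */
    if( t - last_post >= REFRESH_INTERVAL - avg_render ){
        last_post = t;
        glutPostRedisplay();
    }
}

static void display(void)
{
    double t0 = now_seconds();
    glClear(GL_COLOR_BUFFER_BIT);
    /* ... draw ... */
    glutSwapBuffers();
    /* update the filtered render time with the latest measurement */
    avg_render = 0.9 * avg_render + 0.1 * (now_seconds() - t0);
}

/* during setup: glutDisplayFunc(display); glutIdleFunc(idle); */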

Keeping such a timer aligned with v-sync is of course a solved problem (at least in electrical engineering), one that can be addressed by a phase-locked loop (PLL). Essentially you have a "phase comparator" (something that compares whether your timer runs slower or faster than the thing you want to synchronize to), a "charge pump" (a variable to which you add, or from which you subtract, the delta from the phase comparator), a "loop filter" (a sliding window average) and an "oscillator" (a timer) controlled by the loop-filtered value in the charge pump.

So you poll the status of the v-sync (not possible with GLUT functions, and not even possible with core OpenGL or even some of the swap control extensions – you'll have to use OS specific functions for that) and compare whether your timer lags behind or runs fast relative to it. You add that delta to the "charge pump", filter it and feed the result back into the timer. The nice thing about this approach is that it automatically adjusts to, and filters out, the time spent rendering the frames as well.
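
Just to make the analogy concrete, one update step of such a control loop might look like the sketch below. This is illustrative only: the vsync_tick value would have to come from an OS specific source (GLUT provides nothing of the sort), and the gains are arbitrary.

/* Illustrative PLL-style pacing; names and gains are made up. */
static double charge_pump = 0.0;         /* accumulated phase error */
static double interval    = 1.0 / 60.0;  /* period of our "oscillator" (the timer) */

static void pll_update(double our_tick, double vsync_tick)
{
    double delta = vsync_tick - our_tick;   /* phase comparator */
    charge_pump += 0.1 * delta;             /* charge pump */
    /* loop filter: smooth the correction before feeding it back */
    interval = 0.9 * interval + 0.1 * (1.0 / 60.0 + charge_pump);
    /* 'interval' then determines when the next glutPostRedisplay is issued */
}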

Elyn answered 23/3, 2018 at 12:44 Comment(6)
Is there any link between the timer clock rate and my monitor's refresh rate? What if the glutTimerFunc frequency is faster than my monitor's refresh rate? – Lyricist
@DhruvChadha: Synchronized clock sources? In a PC? You wish. That's why one has to jump through all those hoops just to have a decent chance at accurately aligning things in the temporal domain. Things would be so much easier if PCs had a high resolution master clock from which everything else is derived. But that's not how it is. 1980s home computers did have that. And so does broadcast and movie production gear (at least when it comes to the audio/video clock). – Elyn
OK, but in case the glutTimerFunc frequency > refresh rate, how will the monitor try to display something faster than its own capacity? – Lyricist
@DhruvChadha: If v-sync is enabled (which should be the default) then double buffer swaps (issued by glutSwapBuffers or similar) are synchronized to the display refresh, which means that the call of the buffer swap function will block (= pause the program) until the moment the next frame is sent to the display. If v-sync is disabled and rendering is not synchronized otherwise, screen tearing happens: en.wikipedia.org/wiki/Screen_tearing – Elyn
@DhruvChadha: Note that v-sync is a function provided by the GPU. Essentially, after rendering a frame, calling the buffer swap tells the GPU: "Hey, here's a pointer to the next frame that I'd like to be shown to the user; please wake me up when it's done." The GPU then uses its very own clock (which drives the display) to determine when to actually perform the buffer swap, so that it happens right between the end of transmission of one frame to the display and the start of the next. When that has happened, the GPU sends an interrupt to the CPU, which wakes up the program. – Elyn
@DhruvChadha: Note that some OpenGL implementations (most notably the Mesa/Intel driver) do not block on the buffer swap, but on the first call to a function that would alter the contents of the back buffer before it has been swapped into the display scanout front buffer. – Elyn

From the glutMainLoop doc pages:

glutMainLoop enters the GLUT event processing loop. This routine should be called at most once in a GLUT program. Once called, this routine will never return. It will call as necessary any callbacks that have been registered. (emphasis mine)

That means that the idea of glutMainLoop is just processing events, calling whatever callbacks are installed. Indeed, I do not believe that it keeps calling display over and over, but only when there is an event that requests a redisplay.

This is where glutTimerFunc() comes into play. It registers a timer event callback to be called by glutMainLoop when this event is triggered. Note that this is one of several other possible event callbacks that can be registered. That explains why the doc uses the expression at least.

(...) glutTimerFunc registers the timer callback func to be triggered in at least msecs milliseconds. (...)
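
For instance, display is not called over and over by itself; in the sketch below (the key choice is arbitrary) it runs again only because an event callback, here a keyboard handler, posts a redisplay, exactly as a timer callback registered with glutTimerFunc would.

#include <GL/glut.h>

static void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT);
    glutSwapBuffers();
}

/* Dispatched by glutMainLoop on key presses; display() runs again only
   because this handler explicitly requests a redisplay. */
static void keyboard(unsigned char key, int x, int y)
{
    (void)x; (void)y;
    if( key == ' ' )
        glutPostRedisplay();
}

/* during setup: glutDisplayFunc(display); glutKeyboardFunc(keyboard); */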

Pneumococcus answered 23/3, 2018 at 12:2 Comment(0)
