I noticed that my animation suffers from artifacts that look like missed vblanks. No visible tearing, but sometimes the frame halts for a split second and then visibly jumps. I decided to measure the time between buffer swaps:
void draw_cb() {
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    glutSwapBuffers();

    // Print the time between consecutive swaps, in milliseconds.
    static auto last = std::chrono::high_resolution_clock::now();
    auto now = std::chrono::high_resolution_clock::now();
    std::cout
        << std::chrono::duration_cast<std::chrono::microseconds>(now - last).count() / 1000.
        << '\n';
    last = now;
}
To my surprise, I see times varying by as much as 1.5 milliseconds, even in this completely undemanding routine. The times measured between frames are in the vicinity of 16.6 ms, but quite consistently on the higher side.
Nothing changes if I add a usleep
of a few milliseconds in the draw callback (unless it sleeps for more than 16 ms, obviously), which confirms that it is not the drawing commands that cause the delayed response but the wait for vsync. Why then don't I see values very close to 16.666 ms? What other measures could I take to make the animation smooth? I am certain my computer is fast enough.
Here are the relevant parts of how I set up freeglut:
glutInitDisplayMode(GLUT_DOUBLE | GLUT_DEPTH | GLUT_RGBA);
glutDisplayFunc(draw_cb);
glutIdleFunc(glutPostRedisplay);
I also tried putting my draw callback into glutIdleFunc; it made no difference.
The environment is Linux + Gnome 3 on Wayland, with integrated graphics. The load average is well below 1. Looking closely, glxgears
shows similar behaviour, reaching about 291 frames in 5 seconds at its default settings.
Update
With the great help of the commenters, I can now say this is due to the compositor. Running on X without the Wayland layer in between, I get a much sharper and well-centered distribution.
So the problem is specific to Wayland (or perhaps Gnome 3 on Wayland). The question remains unchanged, though: how do I get a smooth animation in this setting with minimal changes? I'm OK with letting go of freeglut if it's somehow not appropriate, but I'd appreciate something equally simple, and would like to keep a decorated, managed window if possible. I updated the title.
Comments:

"There is no accuracy guarantee on std::chrono::high_resolution_clock, so you may want to verify that the one provided by your implementation is actually accurate to within what you expect. Note: talking about accuracy here, not just resolution. Also, note that std::chrono::high_resolution_clock is not required to be steady. What time source are you using to base your animations on?" – Caldron

"The resolution of high_resolution_clock is one nanosecond. (I can't guess about the accuracy, as you point out.)" – Unconstitutional