The following functions are duplicated between opengl32.dll and gdi32.dll:
WGL | GDI
--- | ---
wglChoosePixelFormat | ChoosePixelFormat
wglDescribePixelFormat | DescribePixelFormat
wglGetPixelFormat | GetPixelFormat
wglSetPixelFormat | SetPixelFormat
wglSwapBuffers | SwapBuffers
I have been searching for an answer for a long time now, but no one appears to have any concrete information on why that is or what the exact difference is.
The OpenGL FAQ, section 5.190, suggests that these functions are not functionally identical:
> To ensure correct operation of OpenGL use ChoosePixelformat, DescribePixelformat, GetPixelformat, SetPixelformat, and SwapBuffers, instead of the wgl equivalents, wglChoosePixelformat, wglDescribePixelformat, wglGetPixelformat, wglSetPixelformat, and wglSwapBuffers. In all other cases use the wgl function where available. Using the five wgl functions is only of interest to developers run-time linking to an OpenGL driver.
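For context, this is the conventional path I understand the FAQ to be recommending (a minimal sketch only; window creation and error handling are omitted, and the HDC is assumed to belong to an OpenGL-capable window):

```c
// Minimal sketch: pixel format selection via the gdi32 exports, per the FAQ,
// with wgl used only where there is no GDI twin (context creation).
// Link against gdi32.lib and opengl32.lib.
#include <windows.h>

BOOL SetupGLContext(HDC hdc)
{
    PIXELFORMATDESCRIPTOR pfd = {0};
    pfd.nSize      = sizeof(pfd);
    pfd.nVersion   = 1;
    pfd.dwFlags    = PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER;
    pfd.iPixelType = PFD_TYPE_RGBA;
    pfd.cColorBits = 32;
    pfd.cDepthBits = 24;

    int format = ChoosePixelFormat(hdc, &pfd);   // gdi32, not wglChoosePixelFormat
    if (format == 0 || !SetPixelFormat(hdc, format, &pfd))
        return FALSE;

    HGLRC rc = wglCreateContext(hdc);            // context creation has no GDI equivalent
    if (!rc || !wglMakeCurrent(hdc, rc))
        return FALSE;

    /* ... render ... */
    SwapBuffers(hdc);                            // gdi32, not wglSwapBuffers
    return TRUE;
}
```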
Does "run-time linking to an OpenGL driver" imply bypassing opengl32.dll
and loading an ICD directly?
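My reading of "run-time linking" is roughly the following (a sketch only; "vendor_icd.dll" is a placeholder name, and whether such a DLL even exports wgl*-style entry points is part of what I am asking):

```c
// Hypothetical sketch of run-time linking: load a GL driver DLL directly and
// resolve entry points with GetProcAddress instead of linking against
// opengl32.lib / gdi32.lib. "vendor_icd.dll" is a placeholder, not a real name.
#include <windows.h>

typedef HGLRC (WINAPI *PFN_wglCreateContext)(HDC);

HMODULE LoadDriverDirectly(void)
{
    HMODULE drv = LoadLibraryA("vendor_icd.dll");
    if (!drv)
        return NULL;

    // If this comes back NULL, the DLL does not export the wgl* variants
    // (which, per the comments below, is reportedly the case for the AMD ICD).
    PFN_wglCreateContext pCreate =
        (PFN_wglCreateContext)GetProcAddress(drv, "wglCreateContext");
    (void)pCreate;
    return drv;
}
```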
A StackOverflow thread named "Mesa3D does not like my context creation code" appears to reinforce this.
Another StackOverflow thread, named "wglCreateContext in C# failing but not in managed C++", suggests that opengl32.dll must be loaded before gdi32.dll when using the GDI functions, or you risk a runtime failure ("error: 2000").
My own testing indicates that "error: 2000" occurs on some systems (Nvidia, but not Intel or a Parallels VM) if the WGL version of these functions is called. Changing to the GDI version clears this issue, but using LoadLibrary("opengl32.dll") does not appear to change anything.
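For concreteness, the failing path looks roughly like this (a sketch; the wgl* pixel-format entry points are resolved at run time here because, as far as I can tell, they are not declared in the usual headers). Note that error 2000 is ERROR_INVALID_PIXEL_FORMAT.

```c
// Sketch of the variant that fails for me on some NVIDIA systems: use the
// wgl* pixel-format exports from opengl32.dll instead of the gdi32 ones.
// 2000 == ERROR_INVALID_PIXEL_FORMAT.
#include <windows.h>
#include <stdio.h>

typedef int  (WINAPI *PFN_wglChoosePixelFormat)(HDC, const PIXELFORMATDESCRIPTOR *);
typedef BOOL (WINAPI *PFN_wglSetPixelFormat)(HDC, int, const PIXELFORMATDESCRIPTOR *);

void TryWglPixelFormat(HDC hdc, const PIXELFORMATDESCRIPTOR *pfd)
{
    HMODULE gl = LoadLibraryA("opengl32.dll");
    if (!gl)
        return;

    PFN_wglChoosePixelFormat pChoose =
        (PFN_wglChoosePixelFormat)GetProcAddress(gl, "wglChoosePixelFormat");
    PFN_wglSetPixelFormat pSet =
        (PFN_wglSetPixelFormat)GetProcAddress(gl, "wglSetPixelFormat");
    if (!pChoose || !pSet)
        return;

    int format = pChoose(hdc, pfd);
    if (format == 0 || !pSet(hdc, format, pfd))
        printf("wgl* path failed: GetLastError() = %lu\n", GetLastError());
}
```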
Has anyone ever investigated the difference between these WGL and GDI functions? It is clear that there is some form of difference, and I am trying to understand which version should be used under which circumstances and what the potential pitfalls are if the wrong version is used.
Edit: The Wayback Machine brings up a webpage that describes how direct loading of an ICD works. This was apparently required back in the Voodoo 1/2 days, when the 2D and 3D accelerators were two different pieces of hardware with separate ICDs (which the normal, single-ICD mechanism in opengl32.dll couldn't handle). Quake 1 and 2 would apparently load ICDs directly because of this.
However, a post below shows that the AMD ICD does not export the wgl* variants, which contradicts this idea.
There has to be someone or some place out there that holds the keys to this knowledge.
Edit 2: From the webpage above comes the clearest suggestion yet:
> Therefore if you are using a OpenGL driver named opengl32.dll you must call the GDI functions, and if you are not using a driver named opengl32.dll you must NOT call the GDI functions.
But how does this fit in with the fact that the AMD ICD does not export WGL functions?
Edit 3: Apparently Mesa 3D exports WGL symbols, as can be seen here: http://cgit.freedesktop.org/mesa/mesa/tree/src/mesa/drivers/windows/gdi
This makes sense, since Mesa3D is not supposed to be used as an ICD. This fits with the pattern in the Mesa3D thread linked above: their calls are not being routed through Microsoft's opengl32.dll, so the GDI functions fail, but Mesa3D exports wgl* functions, so those still work. However, that's specific to Mesa3D - that method would fail if you tried to use AMD's ICD directly.
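So, as far as I can tell, the Mesa3D pattern amounts to something like this (a sketch; the DLL path is an assumption, and it only works because Mesa itself exports the wgl* variants):

```c
// Sketch of the Mesa3D-style pattern: load the Mesa-built GL DLL explicitly and
// use ONLY its own wgl* exports, so nothing is routed through Microsoft's
// opengl32.dll or gdi32.dll. The DLL path below is a placeholder.
#include <windows.h>

typedef int   (WINAPI *PFN_wglChoosePixelFormat)(HDC, const PIXELFORMATDESCRIPTOR *);
typedef BOOL  (WINAPI *PFN_wglSetPixelFormat)(HDC, int, const PIXELFORMATDESCRIPTOR *);
typedef HGLRC (WINAPI *PFN_wglCreateContext)(HDC);
typedef BOOL  (WINAPI *PFN_wglMakeCurrent)(HDC, HGLRC);

BOOL SetupMesaContext(HDC hdc)
{
    HMODULE mesa = LoadLibraryA(".\\mesa\\opengl32.dll");   // placeholder path
    if (!mesa)
        return FALSE;

    PFN_wglChoosePixelFormat choose = (PFN_wglChoosePixelFormat)GetProcAddress(mesa, "wglChoosePixelFormat");
    PFN_wglSetPixelFormat    set    = (PFN_wglSetPixelFormat)   GetProcAddress(mesa, "wglSetPixelFormat");
    PFN_wglCreateContext     create = (PFN_wglCreateContext)    GetProcAddress(mesa, "wglCreateContext");
    PFN_wglMakeCurrent       make   = (PFN_wglMakeCurrent)      GetProcAddress(mesa, "wglMakeCurrent");
    if (!choose || !set || !create || !make)
        return FALSE;   // a vendor ICD loaded this way would reportedly fail right here

    PIXELFORMATDESCRIPTOR pfd = { sizeof(pfd), 1,
        PFD_DRAW_TO_WINDOW | PFD_SUPPORT_OPENGL | PFD_DOUBLEBUFFER,
        PFD_TYPE_RGBA, 32 };
    int format = choose(hdc, &pfd);
    if (format == 0 || !set(hdc, format, &pfd))
        return FALSE;

    HGLRC rc = create(hdc);
    return rc && make(hdc, rc);
}
```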
Comments:

[...] LoadLibrary (...) stuff. What language are you using, by the way? Most often when this question comes up it is in the context of C#, where LoadLibrary (...) is a much more common practice. In C and C++, most people link against opengl32.lib / gdi32.lib and use the GDI32 functions. – Ariannearianrhod

[...] ICDs export Drv... entry points instead, which is why ICDs do not export wgl... functions. And as far as Quake 1 and 2 go, they did not use ICDs. They had MiniGL drivers that were awful things; they only implemented exactly the functions that Quake 1 and 2 needed, and usually would not work with any other games. – Ariannearianrhod
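If the ICDs really expose a Drv* interface rather than wgl*, that at least seems easy to verify (a sketch; the DLL name is a placeholder, and the Drv* names here are my understanding of the ICD-interface-style exports):

```c
// Sketch: probe a driver DLL for ICD-style Drv* exports vs. wgl* exports.
// "vendor_icd.dll" is a placeholder name, not a real driver file.
#include <windows.h>
#include <stdio.h>

void ProbeDriverExports(const char *dllName)
{
    HMODULE drv = LoadLibraryA(dllName);
    if (!drv) {
        printf("%s: could not load\n", dllName);
        return;
    }
    const char *names[] = {
        "wglChoosePixelFormat", "wglSetPixelFormat", "wglSwapBuffers",
        "DrvDescribePixelFormat", "DrvSetPixelFormat", "DrvSwapBuffers",
    };
    for (size_t i = 0; i < sizeof(names) / sizeof(names[0]); ++i)
        printf("%-24s %s\n", names[i],
               GetProcAddress(drv, names[i]) ? "exported" : "missing");
    FreeLibrary(drv);
}
```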