glutInitContextVersion() is missing from the GLUT library

I am practicing some OpenGL code. However, when I try to force the OpenGL context to use a specific version of OpenGL through glutInitContextVersion(), compilation fails with this message:

use of undeclared identifier 'glutInitContextVersion'

I want to fix this issue, so I have kept my code as simple as possible.

Code:

#include "File.h"
#include <GLUT/GLUT.h>
#include <OpenGL/OpenGL.h>

using namespace std;

int  main ()
{

    glutInitContextVersion(3,2); // error: use of undeclared identifier 'glutInitContextVersion'

    return 1;
}

However, I was able to use other GLUT functions without any error or warning messages.

I am running Xcode 4.4.1 on a MacBook Air with OS X 10.9.1.

Vociferation answered 26/2, 2014 at 23:9 Comment(0)

GLUT development ended many years ago, and glutInitContextVersion() is actually a non-standard extension (added in FreeGLUT). OS X ships with its own implementation of standard GLUT (3.x). Since you mention OS X 10.9 in your question, it is also worth pointing out that the compiler will generate all sorts of annoying deprecation warnings if you try to use it.

If you want to get a 3.2 core context on OS X using the Frameworks that ship with it, you will have to use CGL (C) or NSOpenGL (Objective C).
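
For example, a rough sketch of the CGL route might look like this (purely illustrative: it assumes the stock OpenGL framework, requests a 3.2 core profile, prints the version, and never creates a window):

#include <OpenGL/OpenGL.h>   /* CGL */
#include <OpenGL/gl3.h>      /* core profile GL declarations */
#include <stdio.h>

int main(void)
{
    /* Ask for a hardware-accelerated 3.2 core profile pixel format. */
    CGLPixelFormatAttribute attribs[] = {
        kCGLPFAOpenGLProfile, (CGLPixelFormatAttribute)kCGLOGLPVersion_3_2_Core,
        kCGLPFAAccelerated,
        (CGLPixelFormatAttribute)0
    };

    CGLPixelFormatObj pix = NULL;
    GLint npix = 0;
    if (CGLChoosePixelFormat(attribs, &pix, &npix) != kCGLNoError || pix == NULL)
        return 1;

    CGLContextObj ctx = NULL;
    CGLError err = CGLCreateContext(pix, NULL, &ctx);
    CGLDestroyPixelFormat(pix);
    if (err != kCGLNoError)
        return 1;

    CGLSetCurrentContext(ctx);
    printf("GL_VERSION: %s\n", (const char *)glGetString(GL_VERSION)); /* should report 3.2+ */

    CGLSetCurrentContext(NULL);
    CGLDestroyContext(ctx);
    return 0;
}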

If you insist on using GLUT, you will need to find a port of FreeGLUT for OS X instead of the Framework that ships with the platform. This of course means tracking down additional dependencies, but I don't think that will be that big a deal. Just remember to stop including the GLUT headers that ship with Xcode.

Stickup answered 26/2, 2014 at 23:27 Comment(3)
I am not familiar with the OS X API since I am new to Mac. I would really rather avoid using GLUT, but when I query the OpenGL version it returns 2.1, while Apple states that the MacBook Air with Intel HD Graphics 4000 supports OpenGL 4.1; that is why I used GLUT. Thanks for the answer. – Vociferation
Ah, well on OS X you need a core profile context in order to get anything OpenGL 3.2 or greater. The GLUT framework that ships with OS X / Xcode will not let you do this, and some of the older interfaces like AGL will not either. You need to use CGL or NSOpenGL to get a 3.2+ context natively, or use FreeGLUT, SDL, glfw3, etc. to hide the low-level stuff from you (see the glfw3 sketch after these comments). If you're just starting out with OS X, I would suggest one of the latter solutions. Dealing with Cocoa (OS X's preferred Objective-C window management framework) is a nightmare for most C++ developers starting out on OS X. – Stickup
I'll try one of the tools you suggested. Thank you for your help, I really appreciate it. – Vociferation
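
For illustration, a rough sketch of the glfw3 route mentioned above might look like this (it assumes GLFW 3.2 or later is installed; the window size and title are arbitrary):

#include <GLFW/glfw3.h>   /* glfw3.h also pulls in the system OpenGL header */
#include <stdio.h>

int main(void)
{
    if (!glfwInit())
        return 1;

    /* On OS X a core profile also requires the forward-compatible flag. */
    glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
    glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 2);
    glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
    glfwWindowHint(GLFW_OPENGL_FORWARD_COMPAT, GLFW_TRUE);

    GLFWwindow *window = glfwCreateWindow(640, 480, "GL 3.2 core via GLFW", NULL, NULL);
    if (!window) {
        glfwTerminate();
        return 1;
    }

    glfwMakeContextCurrent(window);
    printf("GL_VERSION: %s\n", (const char *)glGetString(GL_VERSION));

    glfwDestroyWindow(window);
    glfwTerminate();
    return 0;
}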

You can create a 3.2+ core profile context with the GLUT framework that ships with Xcode on OS X 10.9; you just have to use a different interface. Instead of calling glutInitContextVersion(), you need to add the GLUT_3_2_CORE_PROFILE flag to the glutInitDisplayMode() call:

glutInitDisplayMode(... | GLUT_3_2_CORE_PROFILE);

You will also need to include <OpenGL/gl3.h> before <GLUT/glut.h> to use GL3 and later features.
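
Putting those two points together, a minimal sketch might look something like this (the window size, title and clear colour are arbitrary, and the deprecation warnings mentioned below still apply):

#include <OpenGL/gl3.h>   /* must come before glut.h for GL3+ declarations */
#include <GLUT/glut.h>
#include <stdio.h>

static void display(void)
{
    glClearColor(0.2f, 0.3f, 0.4f, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);
    glutSwapBuffers();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    /* GLUT_3_2_CORE_PROFILE is the Apple-specific flag described above. */
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA | GLUT_DEPTH | GLUT_3_2_CORE_PROFILE);
    glutInitWindowSize(640, 480);
    glutCreateWindow("GL 3.2 core via GLUT");

    /* Should now report 3.2 or higher instead of 2.1. */
    printf("GL_VERSION: %s\n", (const char *)glGetString(GL_VERSION));

    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}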

The whole thing will generate a bunch of compiler warnings since GLUT is marked as deprecated in OS X 10.9. My answer to a related question contains instructions on how to disable those warnings: Glut deprecation in Mac OSX 10.9, IDE: QT Creator.

Portie answered 1/8, 2014 at 8:41 Comment(1)
Man, you rock. This information is REALLY hard to find for some reason. – Passably

I had a similar problem: I had not included <GL/freeglut.h>. glutInitContextVersion() is a newer function added by freeglut, so you need to include the freeglut header in addition to plain GLUT.
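
For illustration, a rough sketch using that function might look like this (it assumes the freeglut headers and library are installed, and that the platform's freeglut build honours the core profile request):

#include <GL/freeglut.h>   /* classic GLUT API plus the freeglut extensions */
#include <stdio.h>

static void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT);
    glutSwapBuffers();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitContextVersion(3, 2);               /* freeglut-only */
    glutInitContextProfile(GLUT_CORE_PROFILE);  /* freeglut-only */
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA | GLUT_DEPTH);
    glutCreateWindow("freeglut 3.2 core");

    printf("GL_VERSION: %s\n", (const char *)glGetString(GL_VERSION));

    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}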

Inadvertent answered 1/8, 2014 at 6:51 Comment(2)
Include freeglut instead of glut. – Expletive
You're right. I deleted #include <GL/glut.h> and it still compiled and ran fine. – Inadvertent