I can send a color to the shader as 4 floats - no problem. However, I want to send it as an integer (or unsigned integer, it doesn't really matter; what matters is that it's 32 bits) and have it decomposed into a vec4 in the shader.
I'm using OpenTK as a C# wrapper for OpenGL (although it should be pretty much just a direct wrapper).
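For context, on the C# side each color would be packed into a single 32-bit value, one byte per channel. A minimal sketch of the packing I have in mind (the PackColor helper and the byte order are just an illustration, not actual code from my project):

// Illustration only: packs four 8-bit channels into one 32-bit value,
// with R in the lowest byte (the exact byte order is an assumption).
private static int PackColor(byte r, byte g, byte b, byte a)
{
    return r | (g << 8) | (b << 16) | (a << 24);
}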
Let's consider one of the simplest shaders, with a vertex containing a position (xyz) and a color (rgba).
Vertex shader:
#version 150 core
in vec3 in_position;
in vec4 in_color;
out vec4 pass_color;
uniform mat4 u_WorldViewProj;
void main()
{
    gl_Position = vec4(in_position, 1.0f) * u_WorldViewProj;
    pass_color = in_color;
}
Fragment shader:
#version 150 core
in vec4 pass_color;
out vec4 out_color;
void main()
{
    out_color = pass_color;
}
Let's create the vertex buffer:
public static int CreateVertexBufferColor(int attributeIndex, int[] rawData)
{
    var bufferIndex = GL.GenBuffer();
    GL.BindBuffer(BufferTarget.ArrayBuffer, bufferIndex);
    GL.BufferData(BufferTarget.ArrayBuffer, sizeof(int) * rawData.Length, rawData, BufferUsageHint.StaticDraw);
    GL.VertexAttribIPointer(attributeIndex, 4, VertexAttribIntegerType.UnsignedByte, 0, rawData);
    GL.EnableVertexAttribArray(attributeIndex);
    GL.BindBuffer(BufferTarget.ArrayBuffer, 0);
    return bufferIndex;
}
And I'm getting all zeros for the vec4 'in_color' in the vertex shader. I'm not sure what's wrong.
The closest thing I found is this thread: https://www.opengl.org/discussion_boards/showthread.php/198690-I-cannot-send-RGBA-color-as-unsigned-int .
Also, in VertexAttribIPointer I'm passing 0 as the stride, because I use a VertexBufferArray and keep the data for each attribute in its own buffer, so the colors come tightly packed: 32 bits (one color) per vertex.
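In other words, the color buffer being uploaded is just one packed int per vertex, something like this (the values here are arbitrary, purely for illustration):

// One packed 32-bit color per vertex (example values only).
int[] rawData =
{
    unchecked((int)0xFF0000FF), // vertex 0
    unchecked((int)0xFF00FF00), // vertex 1
    unchecked((int)0xFFFF0000), // vertex 2
};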
Comments:

Use ivec4 for an integral datatype. – Orbicular

Or in vec4 in conjunction with glVertexAttribPointer, not glVertexAttribIPointer. – Ruvolo
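A minimal sketch of what the second comment suggests, keeping in vec4 in_color in the shader (untested, and assuming the OpenTK overload of GL.VertexAttribPointer that takes an integer byte offset into the currently bound buffer):

public static int CreateVertexBufferColor(int attributeIndex, int[] rawData)
{
    var bufferIndex = GL.GenBuffer();
    GL.BindBuffer(BufferTarget.ArrayBuffer, bufferIndex);
    GL.BufferData(BufferTarget.ArrayBuffer, sizeof(int) * rawData.Length, rawData, BufferUsageHint.StaticDraw);

    // Non-"I" pointer: the four unsigned bytes are converted to floats, and
    // normalized: true maps 0..255 to 0.0..1.0, which matches 'in vec4 in_color'.
    // The last argument is a byte offset into the bound buffer, not the managed array.
    GL.VertexAttribPointer(attributeIndex, 4, VertexAttribPointerType.UnsignedByte, true, 0, 0);

    GL.EnableVertexAttribArray(attributeIndex);
    GL.BindBuffer(BufferTarget.ArrayBuffer, 0);
    return bufferIndex;
}

The first comment's alternative would be to keep GL.VertexAttribIPointer (with an offset rather than the managed array as the last argument) and declare the attribute as an integer type such as ivec4 or uvec4 in the shader, converting it to a vec4 there.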