(OpenGL) How to read back texture buffer?
Is glGetBufferSubData used for both regular buffers and texture buffers? I am trying to troubleshoot why my texture is not showing up, and when I use glGetBufferSubData to read the buffer back I get garbage:

struct TypeGLtexture //Associate texture data with a GL buffer
{
    TypeGLbufferID GLbuffer;
    TypeImageFile  ImageFile;

    void GenerateGLbuffer ()
    {
        if (GLbuffer.isActive==true || ImageFile.GetPixelArray().size()==0) return;
        GLbuffer.isActive=true;
        GLbuffer.isTexture=true;
        GLbuffer.Name="Texture Buffer";
        GLbuffer.ElementCount=ImageFile.GetPixelArray().size();

        glEnable(GL_TEXTURE_2D);
        glGenTextures (1,&GLbuffer.ID);              //instantiate ONE buffer object and return its handle/ID
        glBindTexture (GL_TEXTURE_2D,GLbuffer.ID);   //connect the object to the GL_TEXTURE_2D docking point
        glTexImage2D  (GL_TEXTURE_2D,0,GL_RGB,ImageFile.GetProperties().width, ImageFile.GetProperties().height,0,GL_RGB,GL_UNSIGNED_BYTE,&(ImageFile.GetPixelArray()[0]));

        if (ImageFile.GetProperties().width==6)
        {
            cout<<"Actual Data"<<endl;
            for (unsigned i=0;i<GLbuffer.ElementCount;i++) cout<<(int)ImageFile.GetPixelArray()[i]<<" ";
            cout<<endl<<endl;

            cout<<"Buffer data"<<endl;
            GLubyte read[GLbuffer.ElementCount]; //Read back from the buffer (to make sure)
            glGetBufferSubData(GL_TEXTURE_2D,0,GLbuffer.ElementCount,read);
            for (unsigned i=0;i<GLbuffer.ElementCount;i++) cout<<(int)read[i]<<" ";
            cout<<endl<<endl;
        }
    }


EDIT: Using glGetTexImage(GL_TEXTURE_2D,0,GL_RGB,GL_UNSIGNED_BYTE,read); the data still differs.

Harpsichord answered 21/8, 2015 at 21:29 Comment(1)
According to the documentation, GL_TEXTURE_2D is not one of the allowed parameters for glGetBufferSubData. Do you mean GL_TEXTURE_BUFFER? Actually, I don't see anywhere in the code a hint that you really work on a texture buffer. In case you mean a normal texture instead, then glGetTexImage is your friend.Seeley
Yes, this would work for texture buffers, if this were in fact one of those.

glGetBufferSubData (...) is for Buffer Objects. What you have here is a Texture Object, and you should actually be getting API errors if you call glGetError (...) to check the error state. This is because GL_TEXTURE_2D is not a buffer target; it is a type of texture object.
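
For instance, a minimal check right after the bad call (a sketch reusing the variables from the question) would report GL_INVALID_ENUM, which is what GL generates when the target passed to glGetBufferSubData is not an accepted buffer target:

glGetBufferSubData (GL_TEXTURE_2D, 0, GLbuffer.ElementCount, read); // texture target, not a buffer target
GLenum err = glGetError ();
if (err == GL_INVALID_ENUM)
    cout << "GL_TEXTURE_2D is not a valid buffer target" << endl;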

It is unfortunate, but you are confusing terminology. Even more unfortunate, there is something literally called a buffer texture (it is a special 1D texture) that allows you to treat a buffer object as a very limited form of texture.

Rather than loosely using the term 'buffer' to think about these things, you should consider "data store." That is the terminology that OpenGL uses to avoid any ambiguity; texture objects have a data store and buffer objects do as well. Unless you create a texture buffer object to link these two things they are separate concepts.

Reading back data from a texture object is much more complicated than this.

Before you can read pixel data from anything in OpenGL, you have to define a pixel format and data type. OpenGL is designed to convert data from a texture's internal format to whatever (compatible) format you request. This is why the function you are actually looking for has the following signature:

void glGetTexImage (GLenum      target,
                    GLint       level,
                    GLenum      format, // GL will convert to this format
                    GLenum      type,   // Using this data type per-pixel
                    GLvoid *    img);
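
For the 6x6 RGB image in the question, a direct readback might look like this (a sketch; width and height stand in for ImageFile.GetProperties().width/height, and GL_PACK_ALIGNMENT is set to 1 so the 18-byte rows come back tightly packed, more on that in the comments below):

std::vector<GLubyte> pixels (width * height * 3);   // 3 bytes per GL_RGB pixel
glPixelStorei (GL_PACK_ALIGNMENT, 1);               // do not pad rows to 4-byte boundaries
glBindTexture (GL_TEXTURE_2D, GLbuffer.ID);
glGetTexImage (GL_TEXTURE_2D, 0, GL_RGB, GL_UNSIGNED_BYTE, pixels.data ());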

This applies to all types of OpenGL objects that store pixel data. You can, in fact, use a Pixel Buffer Object to transfer pixel data from your texture object into a separate buffer object. You may then use glGetBufferSubData (...) on that Pixel Buffer Object like you were attempting to do originally.
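
A rough sketch of that two-step readback (the names here are illustrative, and the size is hard-coded for a 6x6 GL_RGB image):

GLuint pbo;
glGenBuffers (1, &pbo);
glBindBuffer (GL_PIXEL_PACK_BUFFER, pbo);
glBufferData (GL_PIXEL_PACK_BUFFER, 6*6*3, NULL, GL_STREAM_READ);

glPixelStorei (GL_PACK_ALIGNMENT, 1);   // tightly pack the 18-byte rows

// With a pixel pack buffer bound, the last argument to glGetTexImage is a
// byte offset into the buffer's data store, not a client memory pointer.
glGetTexImage (GL_TEXTURE_2D, 0, GL_RGB, GL_UNSIGNED_BYTE, (GLvoid *)0);

// Now glGetBufferSubData is legal, because GL_PIXEL_PACK_BUFFER is a buffer target.
GLubyte read [6*6*3];
glGetBufferSubData (GL_PIXEL_PACK_BUFFER, 0, sizeof (read), read);
glBindBuffer (GL_PIXEL_PACK_BUFFER, 0);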

Impeachable answered 21/8, 2015 at 21:39 Comment(4)
Switched to glGetTexImage(GL_TEXTURE_2D,0,GL_RGB,GL_UNSIGNED_BYTE,read); but the data is still not identical ... I hate OpenGL right now.Harpsichord
@ThomasAn: What are the dimensions of your original image data? There's a caveat I didn't mention related to RGB image formats and alignment. Pixel data rows are expected to begin on 4-byte boundaries when transferring data to/from OpenGL. If your image dimensions are not a power-of-two you will run into alignment issues. glPixelStorei (GL_UNPACK_ALIGNMENT, 1) will fix this up (unpacking is what happens when you upload data, packing is when you download).Impeachable
My image was 6x6 ... and here I thought dealing with BMP row padding was all I needed to take care of (and that the power-of-two thing was only for GL prior to 3.0) ... but the gotchas never seem to end. Thanks again for this tip.Harpsichord
@ThomasAn: Ah, actually if you leave the BMP data the way it is and don't touch the pixel store parameters you will automatically be feeding GL data the way it wants. The power-of-two thing isn't exactly what you think; it's really just a happy coincidence that 3x any power-of-two > 2 is divisible by 4 (this is the rule you need to concern yourself with). There's a much larger set of dimensions that will satisfy row alignment, but power-of-two row widths > 2 always do and it's easier to explain that way ;)Impeachable
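
To put numbers on the alignment rule from the comments above, here is the arithmetic for the 6-pixel-wide RGB image (a sketch; the expression is just the "round up to the next multiple of 4" computation GL performs with the default alignment):

GLint rowBytes  = 6 * 3;                // 18 bytes of pixel data per row
GLint paddedRow = (rowBytes + 3) & ~3;  // rounded up to 20 with the default 4-byte alignment
// Each row after the first is therefore offset by 2 extra bytes,
// which scrambles a readback that assumes tightly packed rows.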
