Is it possible to use WebGL max texture size?
I am working on an app where higher resolution is always better.

But I'm stuck on WebGL's MAX_TEXTURE_SIZE limit. I create an image of exactly that size (16384x16384 on my laptop) and WebGL crashes with:

GL_INVALID_ENUM : glBindFramebuffer: target was GL_READ_FRAMEBUFFER_ANGLE
GL_INVALID_ENUM : glBindFramebuffer: target was GL_READ_FRAMEBUFFER_ANGLE
WebGL: CONTEXT_LOST_WEBGL: loseContext: context lost

The same happens when I try 0.75 of the max value. Only half the max resolution works, but that means only a quarter of the pixels my graphics memory could hold are being used!

So my question is whether it is possible to use the max texture size, and if not, how can one find the biggest texture WebGL can eat? I found very little (if any) documentation about this online.

Just in case, here is how I bind my texture:

gl.activeTexture(glTexture);
texture = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, texture);
gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
program.samplerUniform = gl.getUniformLocation(program, textureName);
gl.uniform1i(program.samplerUniform, textureNb);
Crutch asked 30/4, 2015 at 19:04
gl.getParameter(gl.MAX_TEXTURE_SIZE) returns the maximum dimension the GPU can address. In other words, you can make a 1xMAX or a MAXx1 texture and it should work all the way up to MAXxMAX if you have the memory.
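For reference, a minimal sketch of querying that limit (the throwaway canvas is just for illustration):

var gl = document.createElement('canvas').getContext('webgl');
// The largest width or height, in pixels, that the driver will accept.
var maxTextureSize = gl.getParameter(gl.MAX_TEXTURE_SIZE);
console.log(maxTextureSize); // e.g. 16384 on many desktop GPUs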

On the other hand, you cannot make a (MAX+1)x1 texture, because MAX+1 > MAX.

But MAXxMAX? Well, in your case:

16384 * 16384 * 4 channels (RGBA) * 1 byte (UNSIGNED_BYTE) = 1073741824 bytes, or 1 gig!

Does your GPU have 1 gig? In fact it will need more than 1 gig, because the OS, Chrome itself, mips, and whatever else also take memory. Maybe a better example:

16384 * 16384 * 4 channels (RGBA) * 4 bytes (FLOAT) = 4294967296 bytes, or 4 gig!
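To make the arithmetic explicit, here is a small hypothetical helper (the function name and parameters are my own illustration, not from the answer, and it ignores mipmaps and driver overhead):

// Rough lower bound on the memory a level-0 square RGBA texture needs.
function textureBytes(size, bytesPerChannel) {
    return size * size * 4 * bytesPerChannel; // 4 channels: R, G, B, A
}
textureBytes(16384, 1); // 1073741824 bytes (1 gig) for UNSIGNED_BYTE
textureBytes(16384, 4); // 4294967296 bytes (4 gig) for FLOAT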

Even if your GPU has lots of memory, the browser might limit how much you can access. Most likely your browser is running out of memory, which is why it's crashing.

As for knowing the limit on any given device, there is arguably no way to know it. First off, you can't check which GPU the user's machine has except in Chrome (which other browsers argue is a privacy issue). Second, you don't know what else is running on their machine and taking up memory.
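One caveat-laden option: where the browser exposes the WEBGL_debug_renderer_info extension (Chrome does; others may not, for the privacy reasons above), you can at least read the GPU name. A sketch, assuming a gl context as above, and no guarantee the extension is present:

var ext = gl.getExtension('WEBGL_debug_renderer_info');
if (ext) {
    // Unmasked strings identify the actual GPU/driver, when permitted.
    var vendor = gl.getParameter(ext.UNMASKED_VENDOR_WEBGL);
    var renderer = gl.getParameter(ext.UNMASKED_RENDERER_WEBGL);
    console.log(vendor, renderer);
}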

Tedra answered 1/5, 2015 at 11:24
Ok thanks, I misunderstood the meaning of this max value. That actually makes sense now. But how should I go about finding the biggest texture size that will fit in memory (with some margin)? I guess tabs are run in a sandbox and cannot guess their available memory? (Crutch)
Even if they were not run in a sandbox, how do you know they don't have 50 tabs open with a WebGL app in each tab? There really isn't a way that I know of except to allocate progressively larger textures and pray (see the sketch after these comments). If you're making an app, try to use as little memory as possible/reasonable. You might be able to look at navigator.platform to guess a little, at least mobile vs desktop. Otherwise consider giving the user the option of high detail vs low detail, like most PC games. (Tedra)
Sorry, but I think this might be a mistake. From the Khronos spec: "For 1D and 2D textures (and any texture types that use similar dimensionality, like cubemaps) the max size of EITHER dimension is GL_MAX_TEXTURE_SIZE." (Nickelic)
What's a mistake? GL_MAX_TEXTURE_SIZE is just a constant you use to query the limit: in OpenGL by calling glGetIntegerv, as in GLint maxTextureSize; glGetIntegerv(GL_MAX_TEXTURE_SIZE, &maxTextureSize); or in WebGL via var maxTextureSize = gl.getParameter(gl.MAX_TEXTURE_SIZE); As pointed out above, many GPUs return 16384 as their max size. 16384*16384*4 floats (a floating point texture) would be 4 gig. Many GPUs don't even have 4 gig of memory. (Tedra)
I think the interpretation that a GL_MAX_TEXTURE_SIZE of 16384 allows for a "1xMAX or a MAXx1" texture size is misleading. I too get a result of 16384, and my card has 8GB VRAM, so I expect I'll be able to utilize a MAXxMAX texture. Would you agree? In other words, I thought you were suggesting that 16384 represents the maximum total number of pixels allowed in the texture. (Nickelic)
Sorry if I wasn't clear. MAX_TEXTURE_SIZE is the largest dimension the GPU can address. It has nothing to do with whether or not that will fit in memory. So yes, IF you have enough memory you can make MAXxMAX (congratulations on your 8 gig GPU). Plenty of mobile GPUs return MAX=16384 but do not have the 4 gig of RAM needed for a 16384x16384 RGBA FLOAT texture. But regardless of how much memory you have, you cannot make a (MAX+1)x1 texture. (Tedra)
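A minimal sketch of the "allocate progressively larger textures and pray" probe Tedra describes (the function name, starting size, and doubling strategy are illustrative assumptions; drivers may also defer allocation, so even a clean gl.getError() is no guarantee):

// Guess-and-check: allocate ever larger square RGBA textures until
// the driver reports an error, then return the last size that worked.
function probeMaxSquareTexture(gl) {
    var max = gl.getParameter(gl.MAX_TEXTURE_SIZE);
    var tex = gl.createTexture();
    gl.bindTexture(gl.TEXTURE_2D, tex);
    var size = 1024;
    var largestOk = 0;
    while (size <= max) {
        // Passing null allocates storage without uploading pixel data.
        gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, size, size, 0,
                      gl.RGBA, gl.UNSIGNED_BYTE, null);
        if (gl.getError() !== gl.NO_ERROR) break; // e.g. OUT_OF_MEMORY
        largestOk = size;
        size *= 2;
    }
    gl.deleteTexture(tex);
    return largestOk;
}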
