OpenGL ES2 Alpha test problems

I am rendering multiple 3D objects with textures that have alpha. All the textures load fine, but when I render them in front of each other I get the following:


Left is what I have. Right is what it should be. The grid is just to help visualize the perspective.

The texture in front of the red circle texture is clipped. I searched around for an answer, and the common advice is to use:

GLES20.glEnable( GLES20.GL_BLEND );
GLES20.glBlendFunc(GLES20.GL_SRC_ALPHA, GLES20.GL_ONE_MINUS_SRC_ALPHA );

But I am already using that, and it still isn't working. My setup, placed in the onSurfaceCreated() function, is:

GLES20.glClearColor( 0.75f, 0.85f, 1f, 1.0f );
GLES20.glEnable( GLES20.GL_BLEND );
GLES20.glBlendFunc(GLES20.GL_SRC_ALPHA, GLES20.GL_ONE_MINUS_SRC_ALPHA );
GLES20.glEnable( GLES20.GL_DEPTH_TEST );
GLES20.glDepthFunc( GLES20.GL_LEQUAL );
GLES20.glDepthMask( true );
GLES20.glClearDepthf( 1f );

My fragment shader is:

precision mediump float;   // ES2 fragment shaders need a default float precision
uniform sampler2D texture;
varying vec2 texCoord;
void main(){
   gl_FragColor = texture2D( texture, texCoord );
}

Do I have to include anything in the Android manifest to enable alpha testing? I do not want to end up manually sorting my polygons or relying on discard() in the shader, because I need and want some pixels to be translucent.

How do I get alpha rendering to work correctly with the depth buffer in 3D?

Moneychanger answered 24/4, 2014 at 22:1 Comment(11)
Are you sure you have a depth renderbuffer attached to your framebuffer?Buhr
You have to do that? I thought it was automatic for Android. How do I go about doing that?Moneychanger
I never used OpenGL on Android, only on iOS, but I guess it will be the same, since you also use OpenGL ES 2.0. However, I guess you could search for a tutorial... Do you have any code that looks like this? puu.sh/8m7i7.pngBuhr
Take a look at this question as well: #11867186Buhr
Seems related as well: #2739086Buhr
I just tried calling view.setEGLConfigChooser(true); when the view is created, like the linked question says, and still no dice. Also, I don't have any genFrameBuffer code. I do have that code in my Windows engine, but I'm fairly certain Android creates one automatically; otherwise I would not have been able to render the red and blue textures in the first place.Moneychanger
You need to draw back to front for this form of blending to work. In this case, you need to draw the red circle before the blue one. I suspect that you might be drawing them in the wrong order.Volotta
@RetoKoradi But I explicitly stated that I do not want to have to manually change the polygon order. The application has a dynamic camera and the order might change. I know there are games out there that have achieved what I want without having to manually set the order. Any other ideas?Moneychanger
@sgtHale: You don't have a choice if you want to get this to work with blending. If you only need to support alpha=0 and alpha=1, there's another simple approach. Instead of using blending, discard the fragments in the fragment shader based on their alpha values. I can write it up as an answer later today if that sounds like the approach you're looking for, and need more detail.Volotta
@Reto Koradi. Is there really no other way? I thought there were some applications that had alpha blending with depth buffers over the full alpha = 0-1 range. But yes, I'd be glad to accept your approach as my answer.Moneychanger
@sgtHale: There are one or two other options I can think of that haven't been mentioned yet. At least one of them is not supported by ES2 features, though. I'll type up an answer when I get home.Volotta

Here is an overview of a few methods to render with transparency in OpenGL, with advantages and disadvantages for each.

Alpha Testing

This is a very limited method, but is sufficient for the specific case the poster asked about. The example shown does not really need transparency because everything is either fully opaque or fully transparent (alpha = 1.0 or alpha = 0.0).

There used to be an alpha test for this purpose in OpenGL, but that is a deprecated feature, and is of course not in ES. You can emulate the same thing in your fragment shader, which will look something like this:

precision mediump float;
uniform sampler2D tex;
varying vec2 texCoord;
void main() {
    vec4 val = texture2D(tex, texCoord);
    if (val.a > 0.5) {
        gl_FragColor = val;
    } else {
        discard;  // emulated alpha test
    }
}

Advantages:

  • Simple.
  • No additional work on app side.

Disadvantages:

  • Only works for full opacity/transparency, cannot deal with semi-transparency.
  • Can hurt performance because it typically means that depth testing before the fragment shader has to be disabled.

Sorting and Blending

Rendering transparency is a primary use case for blending. The most common approach is to set the blend function to SRC_ALPHA, ONE_MINUS_SRC_ALPHA, enable blending, and render with the alpha component of the rendered fragments containing the desired opacity.

If the scene contains a mixture of fully opaque objects and objects with transparency, the fully opaque objects can be rendered first, without a need for them to be sorted. Only the objects with transparency need to be sorted. The sequence is then:

  1. Render fully opaque geometry.
  2. Render non-opaque geometry, sorted back to front.
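As a minimal sketch of step 2, the back-to-front sort can be done by comparing squared distances from the camera (the `TransparencySort` class, helper names, and float[3] position representation are all assumptions for illustration, not part of any GL API):

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class TransparencySort {

    // Squared distance is enough for ordering and avoids a sqrt per object.
    static float squaredDistance(float[] a, float[] b) {
        float dx = a[0] - b[0], dy = a[1] - b[1], dz = a[2] - b[2];
        return dx * dx + dy * dy + dz * dz;
    }

    // Sorts object positions so the farthest object comes first in draw order.
    static void sortBackToFront(List<float[]> objects, float[] camera) {
        objects.sort(Comparator.comparingDouble(
                (float[] p) -> squaredDistance(p, camera)).reversed());
    }

    public static void main(String[] args) {
        float[] camera = {0f, 0f, 5f};
        List<float[]> objects = new ArrayList<>();
        objects.add(new float[]{0f, 0f, 0f});   // near the camera
        objects.add(new float[]{0f, 0f, -10f}); // far from the camera
        sortBackToFront(objects, camera);
        // The far object should now be first, so it is drawn first.
        System.out.println(objects.get(0)[2]);
    }
}
```

Note that this sorts by object center, which is only an approximation; it can still produce wrong results for large or intersecting transparent objects.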

Advantages:

  • Can handle semi-transparency.
  • Can handle multiple layers of transparent geometry.
  • Rendering itself is very efficient.

Disadvantages:

  • Requires sorting for correct result. For the blend function mentioned above, geometry has to be rendered back to front. Depending on the application, this can be between no big deal and almost impossible. For example, to correctly render intersecting geometry, you may have to start splitting triangles, which is far from attractive.

Depth Peeling

This is a very clever use of OpenGL features, IMHO, and can be a good practical solution. It does require multiple rendering passes. The simple form requires 3 passes:

  1. Render scene with the usual settings (depth testing enabled, depth function LESS, color and depth write enabled), but render only the fully opaque geometry. If opacity is per object, you can handle that by skipping draw calls for non-opaque objects. Otherwise, you will have to discard non-opaque fragments with a shader similar to the one under Alpha Testing above.
  2. Render the non-opaque geometry with the same settings as above, except that color write is disabled.
  3. Render the non-opaque geometry again, but this time with depth function EQUAL, color write enabled again, depth write disabled, and using blending.

A minimal shader can be used for pass 2, since it does not need to produce any valid fragment colors.
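In Android's GLES20 terms, the three passes could be configured roughly as follows (the two draw helpers are hypothetical placeholders for your own draw calls; this is a state-setup sketch, not a complete renderer):

```
// Pass 1: opaque geometry with the usual settings.
GLES20.glEnable(GLES20.GL_DEPTH_TEST);
GLES20.glDepthFunc(GLES20.GL_LESS);
GLES20.glDepthMask(true);
GLES20.glColorMask(true, true, true, true);
drawOpaqueGeometry();          // hypothetical helper

// Pass 2: non-opaque geometry, depth writes only (color writes disabled).
GLES20.glColorMask(false, false, false, false);
drawTransparentGeometry();     // hypothetical helper

// Pass 3: non-opaque geometry again, blended at the front-most depth.
GLES20.glColorMask(true, true, true, true);
GLES20.glDepthFunc(GLES20.GL_EQUAL);
GLES20.glDepthMask(false);
GLES20.glEnable(GLES20.GL_BLEND);
GLES20.glBlendFunc(GLES20.GL_SRC_ALPHA, GLES20.GL_ONE_MINUS_SRC_ALPHA);
drawTransparentGeometry();
```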

Advantages:

  • Simple to implement.
  • Reasonably efficient, does not require sorting.
  • Correctly handles semi-transparency.

Disadvantages:

  • Simple form only draws the front-most layer of transparent geometry. This may sound like a major limitation, but the results can actually look very good. There are more advanced forms where additional layers are rendered with additional passes. Beyond the overhead of those additional passes, it also gets more complex because it requires multiple depth buffers. I believe there's a white paper about it on the NVIDIA web site.

Alpha to Coverage

I haven't used this myself, so the following is based on my limited theoretical understanding. It looks like another interesting method. It requires multisampled rendering. The feature is enabled with glEnable(GL_SAMPLE_ALPHA_TO_COVERAGE), which translates alpha values into a coverage mask, so that only part of the samples are written, depending on the alpha value. This produces a transparency effect when the multisample buffer is downsampled to the final color buffer.
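On Android, enabling it is a one-liner, assuming the surface was created with a multisampled EGL config (e.g. via a config chooser requesting EGL_SAMPLES >= 4):

```
// Only has a visible effect on a multisampled surface.
GLES20.glEnable(GLES20.GL_SAMPLE_ALPHA_TO_COVERAGE);
```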

Advantages:

  • Can handle semi-transparency.
  • Correctly handles multiple layers of transparency.
  • Efficient, particularly if MSAA would have been used anyway. No sorting required.

Disadvantages:

  • Requires MSAA. Modern GPUs are very effective at MSAA rendering, so this is not a huge deal. Oftentimes, you will probably want to use MSAA anyway.
  • Effective resolution of alpha value is very small, unless I'm missing something. For example, with 4x MSAA, you can only represent 5 possible alpha values (0, 1, 2, 3, 4 samples set in coverage mask).
Volotta answered 25/4, 2014 at 2:38 Comment(5)
Great analysis. I guess Alpha testing will be sufficient for android.Moneychanger
Note that alpha testing can be very bad for performance because of the branch. Having ifs or loops in your fragment shader is usually expensive.Buhr
@MartijnCourteaux: You obviously wouldn't want unnecessary branches. But if you need them, I don't think performance will be that bad. Have you benchmarked it on modern hardware? I would actually be much more concerned about the performance impact of the discard, because it often requires disabling of early depth testing.Volotta
@T_01: For the sorting mentioned here under "Sorting and Blending", this is something you have to do in your own code. You can use any sorting algorithm you like. Start here if you need an overview: en.wikipedia.org/wiki/Sorting_algorithm.Volotta
In fact, since sgtHale is talking about Android, he may be using a PowerVR/Adreno/Mali GPU. In which case, not only is sorting opaque geometry not necessary, it is actually a silly thing to do. TBDR GPUs sort opaque geometry when they do deferred tiling, so any benefit you might have gotten from sorting front-to-back on the CPU is just a waste of CPU cycles ;) discard also breaks TBDR GPU efficiency, since they are basically early depth testing on steroids.Shelve

The accepted answer doesn't cover this case.

In your case, the solution is to disable depth testing and possibly reverse the order in which you draw.

The shader discard is a hack that will look awful if you need a range of alpha values.

Expressman answered 10/2, 2017 at 7:52 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.