Godot Supersampling Anti-Aliasing (SSAA) + Render Scaling

Was helping @Nanites with their project, but I think this deserves its own thread. I implemented supersampling anti-aliasing (SSAA) for 2D GLES3 projects, with a render scale slider (10% to 200%). In the video, at the highest settings, I'm rendering at 4K and supersampling down to 1080p, and performance is still quite good. I'm planning to publish it to the AssetLib, maybe tomorrow, and I'm working on a 3D version now. The project is attached.
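
Roughly, the setup is: render the scene into an oversized offscreen Viewport, then draw its texture back at window size. A minimal sketch in Godot 3.x GDScript (the node names and `render_scale` export are illustrative, not the attached project's exact code):

```gdscript
extends Node

# Render scale slider range: 10% .. 200% of the window size.
export(float, 0.1, 2.0) var render_scale = 2.0

onready var viewport = $Viewport      # the 2D scene lives inside this
onready var display = $TextureRect    # fullscreen rect that shows the result

func _ready():
    # Render offscreen at a multiple of the window size...
    viewport.size = OS.window_size * render_scale
    viewport.render_target_v_flip = true  # 2D viewport textures come out flipped
    # ...then show the result filtered back down to native resolution.
    var tex = viewport.get_texture()
    tex.flags |= Texture.FLAG_FILTER      # bilinear filtering when downsampling
    display.texture = tex
    display.rect_size = OS.window_size
```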

Counter answered 13/12, 2021 at 5:06

I have a new version that supports GLES2. I had to implement the partial derivatives myself in shader code, but it wasn't too hard. Performance is much better now. I can get 5,000 fps at 5K!!!
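
For context: GLES2 only exposes dFdx()/dFdy() behind the GL_OES_standard_derivatives extension, but for a fullscreen pass you can get away without them, since UV changes by a constant amount per screen pixel. Something along these lines (a sketch of the idea, not the plugin's exact code):

```glsl
shader_type canvas_item;

void fragment() {
    // On a fullscreen quad, UV is linear in screen position, so the
    // per-pixel UV derivatives are constants derived from the screen size:
    vec2 duv_dx = vec2(SCREEN_PIXEL_SIZE.x, 0.0); // stands in for dFdx(UV)
    vec2 duv_dy = vec2(0.0, SCREEN_PIXEL_SIZE.y); // stands in for dFdy(UV)

    // The downsampler can then use duv_dx / duv_dy to space its taps
    // across the higher-resolution source texture.
    COLOR = texture(TEXTURE, UV);
}
```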

Counter answered 13/12, 2021 at 15:19

So I fixed a major bug and made a small optimization. Also, even though I called it SSAA, it works at any render scale, upscaling included. Here is a shot at 25% scale with just the shader (texture filtering disabled).

However, I tested it on my cheap $200 Intel system (HD 500), and performance is pretty bad: around 15 fps at an 8K render (sampled down to 4K). Of course, that's ridiculous, and no one would really render at 8K on a machine like that, but I still have to figure out how to make it work well on cheap PCs and mobile devices.

Counter answered 14/12, 2021 at 3:41

Here is how it looks at full quality, rendered at 4K.

Counter answered 14/12, 2021 at 11:04

So I must have done some John Carmack voodoo magic. I don't understand it, but my shader seems to increase performance even when doing nothing: I render the scene to an offscreen viewport, then simply sample the pixels 1:1 and draw them on a fullscreen quad. It looks exactly the same, but I'm getting around 30% more performance by doing this. It doesn't make any sense to me. So now I can set the render scale to 125% and gain image quality at the same performance as native. I mean, I'm happy, but it doesn't make sense.
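
For reference, the "doing nothing" pass described above amounts to a pass-through like this (a minimal hypothetical version, not the project's exact shader):

```glsl
shader_type canvas_item;

// Pass-through: sample the offscreen viewport texture 1:1 and output it
// unchanged. Even this extra fullscreen pass measured ~30% faster than
// rendering the scene directly to the screen on my machine.
void fragment() {
    COLOR = texture(TEXTURE, UV);
}
```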

Counter answered 15/12, 2021 at 1:10

It might be a driver-level optimization kicking in; Nvidia in particular tends to have things like that going on in their proprietary drivers.

Trilateration answered 15/12, 2021 at 3:21

Well, I'm on AMD and Linux. You're probably right that it's some sort of driver optimization. Strange, though.

Counter answered 15/12, 2021 at 3:30

So it's almost finished. I got it working in 3D and it works perfectly: I'm able to achieve what looks like a 4K render on a 1080p monitor. Well, maybe not quite as good as a real 4K monitor, but pretty close. This is probably closer to Nvidia DSR or AMD VSR than to SSAA. I still have a few bugs in one of the other options, but this part seems to be working.
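
The 3D setup is essentially the same as the 2D one, except the offscreen Viewport owns the 3D world and its camera. A rough sketch, again with illustrative node names rather than the project's real code:

```gdscript
extends Node

export(float, 0.1, 2.0) var render_scale = 2.0

onready var viewport = $Viewport      # contains the 3D scene and its Camera
onready var display = $TextureRect    # fullscreen rect, e.g. on a CanvasLayer

func _ready():
    # Render the 3D world at a multiple of the window size (DSR/VSR style),
    # then show it downsampled at native resolution.
    viewport.size = OS.window_size * render_scale
    viewport.render_target_v_flip = true
    viewport.msaa = Viewport.MSAA_4X  # optional: MSAA combines fine with this
    display.texture = viewport.get_texture()
    display.rect_size = OS.window_size
```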

Counter answered 15/12, 2021 at 4:05

Here are the highest and lowest quality shots. It's basically finished, but I need to add a few more things before publishing to GitHub.

Highest quality 4K to 1080p on a 6800 XT getting around 1000 fps.

Lowest quality 540p to 1080p on an HD 500 getting around 80 fps.

Counter answered 15/12, 2021 at 16:22

So 3D is fully working. I got it integrated into Decay and the results are nice. Even at 100% scale, my shader does seem to increase picture quality a little. However, going above 100% in a graphics-heavy demo isn't viable; I've already pushed the graphics to the limit, so there's no performance headroom left. 50% scale looks good, though, and should help people on older computers play my demo (which was the main purpose anyway).

Counter answered 16/12, 2021 at 4:03

I'm crazy, so I rewrote the scaler shader from scratch tonight. Since the supersampling wasn't really giving me the performance I wanted anyway, I decided to prioritize the upsampling side of it. Also, the previous version was based on public-domain code, and I felt a little like I was cheating, so I reimplemented it from scratch with my own code.

The new version retains detail much better; even as low as 540p it still looks pretty nice. However, since the image is much sharper, you might have to add FXAA at lower resolutions (or even at higher ones, for performance reasons) to get a more stable image. Also, it does no AA itself, so you'll still need MSAA or FXAA. The previous version did some sort of AA, but it performed much worse than MSAA/FXAA, so it wasn't worth keeping.

Performance is pretty decent too, at least on an okay GPU. On the Intel HD 500 it's just north of 60 fps, but that was with a simple test scene, so it would probably not be usable in a real game (in that case you'd just have to reduce the resolution in the project settings and live with bilinear interpolation).
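
I won't paste the whole thing, but the general idea of a detail-preserving upscale can be sketched as a bilinear sample plus a small unsharp mask that restores the edge contrast interpolation blurs away (illustrative only; the `sharpness` uniform is an assumed knob, not my shader's real parameter):

```glsl
shader_type canvas_item;

uniform float sharpness : hint_range(0.0, 1.0) = 0.5;

void fragment() {
    vec2 t = TEXTURE_PIXEL_SIZE;    // one source texel in UV units
    vec4 c = texture(TEXTURE, UV);  // bilinear center tap
    // Local average of the four axis neighbors.
    vec4 blur = (texture(TEXTURE, UV + vec2( t.x, 0.0)) +
                 texture(TEXTURE, UV + vec2(-t.x, 0.0)) +
                 texture(TEXTURE, UV + vec2(0.0,  t.y)) +
                 texture(TEXTURE, UV + vec2(0.0, -t.y))) * 0.25;
    // Unsharp mask: push the center away from its local average.
    COLOR = c + (c - blur) * sharpness;
}
```

Sharpening like this also amplifies stair-stepping, which is exactly why FXAA becomes useful at low render scales.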

Native render 1080p:

4K render sampled to 1080p:

540p render sampled to 1080p:

On the 540p image in particular, you can see that a lot of high-frequency detail is lost on the cube (this is unavoidable; the information just isn't there), but if you look at the floor and columns, there is very little loss of quality. The background, too, while pixelated, looks much better than you'd expect from 540p and far better than bilinear filtering. In this case I had FXAA off so you could see my shader; at lower resolutions you would enable it, and it gets rid of most of the pixel artifacts.

Counter answered 16/12, 2021 at 14:06

Here are some better shots after tweaking some things.

4K:

1080p:

540p:

Counter answered 16/12, 2021 at 16:13
