Why BGRA instead of RGBA?

So, I was developing a game engine in C++ and DirectX 11, and I noticed that people usually use BGRA instead of RGBA as the SwapChain format. Why do people do that? And why is it 32-bit color instead of 256-bit color?

Clementia asked 26/12, 2022 at 23:59 Comment(3)
"And why is 32 bit color instead 256 bit color?" Did you mean 8-bit color with a palette? – Sams
Because AFAIK it's quite common for the display controller in your GPU to work with BGRA instead of RGBA, and providing graphics in the format of the display controller eliminates the need to convert it. – Illuminism
I found a similar discussion about Vulkan and BGRA: https://www.reddit.com/r/vulkan/comments/p3iy0o/why_use_bgra_instead_of_rgba/ – Sams

Historically, Direct3D has supported whatever formats the video cards wanted to expose, and it supported both RGBA and BGRA formats.

For Direct3D 10, there was an active effort to simplify the support matrix for developers and one of the areas was to try to standardize on RGBA only formats. Direct3D 10 / DXGI 1.0 only supported RGBA formats as a result.

For Direct3D 11, the older BGRA formats were added back in DXGI 1.1 because many Direct3D 9 era drivers still preferred them, and they were needed to support the 10level9 Direct3D Hardware Feature Levels. 16bpp formats (which are all BGRA) were added later, in DXGI 1.2, to support 'mobile-class' GPUs.

B5G6R5 and B5G5R5A1 were defined in DXGI 1.1 but weren't supported by any driver until DXGI 1.2.

Thus, the modern DXGI format list is mostly RGBA, but it also retains the following BGRA formats:

DXGI_FORMAT_B8G8R8A8_UNORM
DXGI_FORMAT_B8G8R8A8_UNORM_SRGB
DXGI_FORMAT_B8G8R8X8_UNORM
DXGI_FORMAT_B8G8R8X8_UNORM_SRGB


DXGI_FORMAT_B5G6R5_UNORM
DXGI_FORMAT_B5G5R5A1_UNORM
DXGI_FORMAT_B4G4R4A4_UNORM
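
Because the 16bpp formats are optional (as noted above, no driver exposed them before DXGI 1.2), it's worth verifying support at runtime before relying on them. A minimal sketch using ID3D11Device::CheckFormatSupport, assuming 'device' points to an already-created ID3D11Device:

UINT support = 0;
bool has16bpp =
    SUCCEEDED(device->CheckFormatSupport(DXGI_FORMAT_B5G6R5_UNORM, &support)) &&
    (support & D3D11_FORMAT_SUPPORT_TEXTURE2D) != 0;
// If it isn't supported, fall back to DXGI_FORMAT_B8G8R8A8_UNORM.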

In terms of swap-chain formats, only the following are supported for "display scan-out":

// Direct3D hardware feature level 9.1 or later
DXGI_FORMAT_B8G8R8A8_UNORM
DXGI_FORMAT_B8G8R8A8_UNORM_SRGB

// Direct3D hardware feature level 9.3 or later
DXGI_FORMAT_R8G8B8A8_UNORM
DXGI_FORMAT_R8G8B8A8_UNORM_SRGB

// Direct3D hardware feature level 10.0 or later
DXGI_FORMAT_R16G16B16A16_FLOAT
DXGI_FORMAT_R10G10B10A2_UNORM
DXGI_FORMAT_R10G10B10_XR_BIAS_A2_UNORM

DXGI_FORMAT_B8G8R8A8_UNORM is therefore the only format that's supported by ALL Direct3D hardware feature levels for the swapchain, which is why a lot of samples and engines default to using it. That said, unless you are trying to support original Windows RT on ARM devices, modern games and samples are all going to use a Direct3D Hardware Feature Level of 10.0 or better as a minimum anyhow, so you can pretty much use any of the valid formats listed above for the swapchain.
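
For illustration, here is a minimal sketch of a flip-model swap-chain description that defaults to that universally supported format. It assumes DXGI 1.2 (IDXGIFactory2::CreateSwapChainForHwnd), ComPtr-wrapped 'device' and 'factory' objects and an 'hwnd' that already exist, and it elides error handling; none of those names come from the answer itself:

#include <dxgi1_2.h>
#include <wrl/client.h>

DXGI_SWAP_CHAIN_DESC1 desc = {};
desc.Width = 0;                                   // 0 = use the window's client size
desc.Height = 0;
desc.Format = DXGI_FORMAT_B8G8R8A8_UNORM;         // valid on every hardware feature level
desc.SampleDesc.Count = 1;
desc.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;
desc.BufferCount = 2;
desc.SwapEffect = DXGI_SWAP_EFFECT_FLIP_DISCARD;  // flip model (Windows 10 and later)

Microsoft::WRL::ComPtr<IDXGISwapChain1> swapChain;
HRESULT hr = factory->CreateSwapChainForHwnd(
    device.Get(), hwnd, &desc, nullptr, nullptr, swapChain.GetAddressOf());

If your minimum is hardware feature level 10.0 or better, substituting DXGI_FORMAT_R8G8B8A8_UNORM or DXGI_FORMAT_R10G10B10A2_UNORM into desc.Format works just as well.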

For Direct3D 11, you should also use D3D11_CREATE_DEVICE_BGRA_SUPPORT as a safety check. Really old first-generation WDDM drivers didn't actually support BGRA, and this flag checks for that edge case as part of device creation. In practice, any driver newer than the Windows Vista RTM era is going to support it. The flag is also essential when doing interop with Direct2D / DirectWrite, since Direct2D only supports BGRA formats (the original Windows GDI was BGRA only).
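
As a rough sketch of what that looks like at device-creation time (standard d3d11.h and wrl/client.h headers assumed, error handling elided):

#include <d3d11.h>
#include <wrl/client.h>

UINT creationFlags = D3D11_CREATE_DEVICE_BGRA_SUPPORT;  // required for Direct2D / DirectWrite interop
#if defined(_DEBUG)
creationFlags |= D3D11_CREATE_DEVICE_DEBUG;             // enable the debug layer in debug builds
#endif

Microsoft::WRL::ComPtr<ID3D11Device> device;
Microsoft::WRL::ComPtr<ID3D11DeviceContext> context;
D3D_FEATURE_LEVEL featureLevel = {};

HRESULT hr = D3D11CreateDevice(
    nullptr,                    // default adapter
    D3D_DRIVER_TYPE_HARDWARE,
    nullptr,
    creationFlags,
    nullptr, 0,                 // default feature-level list (11.0 down to 9.1)
    D3D11_SDK_VERSION,
    device.GetAddressOf(),
    &featureLevel,
    context.GetAddressOf());
// On one of those very old WDDM drivers without BGRA support, this call fails,
// so the edge case is caught here rather than when creating the swap chain.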

See also this blog post series, particularly for some quirks around how _SRGB formats are handled these days.

For one more bit of BGRA vs. RGBA DirectX trivia, there is a long-standing bug in how 10bpp BGRA vs. RGBA pixel formats were encoded in DDS files due to a symmetric bug in the legacy D3DX library. See this blog post for details.

Excruciate answered 27/12, 2022 at 2:52 Comment(0)
