Why do DirectX fullscreen applications give black screenshots?

You may know that trying to capture DirectX fullscreen applications the GDI way (using BitBlt()) gives a black screenshot.
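
For reference, this is roughly what I mean by "the GDI way" (a minimal sketch of my understanding, capturing the whole screen; error handling and saving the bitmap to disk are omitted):

#include <windows.h>

HBITMAP CaptureScreenGdi()
{
    int width  = GetSystemMetrics(SM_CXSCREEN);
    int height = GetSystemMetrics(SM_CYSCREEN);

    HDC screenDc = GetDC(nullptr);                // DC for the whole screen
    HDC memDc    = CreateCompatibleDC(screenDc);  // memory DC that receives the copy
    HBITMAP bmp  = CreateCompatibleBitmap(screenDc, width, height);
    HGDIOBJ old  = SelectObject(memDc, bmp);

    // With a fullscreen exclusive-mode DirectX game running, this copies the
    // desktop surface, not the game's back buffer, so the result is black.
    BitBlt(memDc, 0, 0, width, height, screenDc, 0, 0, SRCCOPY);

    SelectObject(memDc, old);
    DeleteDC(memDc);
    ReleaseDC(nullptr, screenDc);
    return bmp;   // caller owns the bitmap and must DeleteObject() it
}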

My question is rather simple but I couldn't find any answer: why? I mean technically, why does it give a black screenshot?

I'm reading a DirectX tutorial here: http://www.directxtutorial.com/Lesson.aspx?lessonid=9-4-1. It's written:

[...] the function BeginScene() [...] does something called locking, where the buffer in the video RAM is 'locked', granting you exclusive access to this memory.

Is this the reason? Is the VRAM locked so GDI can't access it, which results in a black screenshot? Or is there another reason, e.g. DirectX talks directly to the graphics card and GDI never sees what it draws?

Thank you.

Emissive answered 3/2, 2014 at 18:36 Comment(0)

The reason is simple: performance.

The idea is to render the scene on the GPU as much as possible, out of lock-step with the CPU. You use the CPU to send the rendering buffers to the GPU (vertices, indices, shaders, etc.), which is overall really cheap because they're small, and then you go do whatever you want: physics, multiplayer sync, and so on. The GPU can crunch the data and render it on its own.

If you require the scene to be drawn into the window, you have to interrupt the GPU, ask for the rendering buffer bytes (LockRect), ask for the graphics object of the window (more interference with the GPU), render into it, and free every lock. You just lost any gain you had from rendering on the GPU out of sync with the CPU. It's even worse when you think of all the CPU cores sitting idle because you're busy "rendering" (more like waiting on buffer transfers).
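
To make the cost concrete, here is a rough Direct3D 9 sketch of that read-back path (a sketch only: error handling is omitted, and the surface size/format are assumed to match the back buffer):

#include <d3d9.h>

void ReadBackBuffer(IDirect3DDevice9* device, UINT width, UINT height)
{
    IDirect3DSurface9* backBuffer = nullptr;
    IDirect3DSurface9* sysMemCopy = nullptr;

    device->GetBackBuffer(0, 0, D3DBACKBUFFER_TYPE_MONO, &backBuffer);
    device->CreateOffscreenPlainSurface(width, height, D3DFMT_A8R8G8B8,
                                        D3DPOOL_SYSTEMMEM, &sysMemCopy, nullptr);

    // GetRenderTargetData blocks until every pending draw call has finished:
    // the CPU sits and waits for the GPU before it can read a single byte.
    device->GetRenderTargetData(backBuffer, sysMemCopy);

    D3DLOCKED_RECT rect;
    sysMemCopy->LockRect(&rect, nullptr, D3DLOCK_READONLY);
    // ... read rect.pBits / rect.Pitch on the CPU ...
    sysMemCopy->UnlockRect();

    sysMemCopy->Release();
    backBuffer->Release();
}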

So what graphics drivers do is paint the rendering area with a magic color and tell the GPU where the scene goes. The GPU then takes care of overlaying the scene onto the displayed screen based on the magic-color pixels (think of a multi-pass pixel shader that samples the second texture wherever the first texture has a certain color at x,y; not that slow). You get completely out-of-sync rendering, but when you ask the OS for its video memory, you get the magic color where the scene is, because that is what it actually contains.
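
Purely as an illustration (this is not actual driver code), the compositing rule described above boils down to something like this, with MAGIC_KEY being a made-up key color:

#include <cstddef>
#include <cstdint>

const uint32_t MAGIC_KEY = 0xFF100010; // hypothetical "almost black" key color

// desktop: what GDI/the OS rendered; overlay: the scene the GPU rendered;
// screen: what actually reaches the monitor.
void ComposeScanout(const uint32_t* desktop, const uint32_t* overlay,
                    uint32_t* screen, size_t pixelCount)
{
    for (size_t i = 0; i < pixelCount; ++i)
        screen[i] = (desktop[i] == MAGIC_KEY) ? overlay[i] : desktop[i];
}

A GDI screenshot reads the desktop buffer, which still holds the key color where the game is rendered, and that is why the capture comes out black (or whatever the key color happens to be).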

Reference: http://en.wikipedia.org/wiki/Hardware_overlay

Summit answered 3/2, 2014 at 18:48 Comment(10)
I actually didn't know they switched from the magic color system, though it makes sense considering how hacky it is. TIL! The theory remains the same regardless of how you define your overlay area. – Summit
At first I didn't understand your message, so I'll reformulate. Actually, GDI shares the same memory to render all graphics from all processes (start menu, windows, etc.). However, since all processes share the same memory, it needs some synchronization mechanism (memory lock/unlock) to make sure memory is not read/written at the same time by 2 threads. This mechanism takes TIME! To get better performance, DirectX uses dedicated memory, so there is no more need for sync! The shared memory is filled with a magic color (e.g. black); that's why screenshots are black. – Emissive
You completely misunderstood. GDI doesn't own anything outside of the little window, which games don't even use. GDI runs on the CPU; anything it touches is processed by the CPU, which is exactly what we want to avoid. So instead the GPU is fed an almost-black rectangle (your window) that never changes (never gets redrawn), and the graphics card just paints over the almost-black pixels on its own. – Summit
Ok, sorry, I just realized GDI is actually an API for graphics. I thought it was some kind of system whose purpose was to handle graphics in Windows or something like that. If I understood correctly, the only thing GDI does is draw geometric figures to devices (screen, printer...) using the CPU, whereas DirectX uses the GPU (which is much faster) to compute the geometric figures, right? But then, what's the story with the dedicated memory? It means there IS a shared memory, right (even if it's not managed by GDI)? Sorry to bother you, I'm a little confused... – Emissive
The dedicated memory is VRAM, i.e. the RAM on your graphics card. Reading it from inside the graphics card is extremely fast. It's by no means shared, though; the so-called shared texture memory DirectX uses backs up textures in system RAM too if you need them (as read-only), or just goes to VRAM and downloads them (slow, using Lock) if you want to modify them (it's a flag when you create texture surfaces). – Summit
Could you answer a few questions to make it clear, please? 1) You said: "[...] when you ask the OS for its video memory [...]". Should I understand there is a video memory for "normal applications" (most OS apps) and another one for DirectX? 2) Which processing unit (CPU/GPU) does GDI/DirectX use to compute the graphics (geometric figures, shaders, etc.)? 3) Which memory does GDI/DirectX use to render the graphics (RAM/VRAM)? 4) What's the lock story? 5) Could you explain more in depth and clearly (or give a link to an article) what the differences between GDI and DirectX are? – Emissive
1. No, GDI just doesn't use video RAM; it's all done on the CPU side. 2. Again, GDI is all done on the CPU; it has no idea how to use shaders. DirectX runs on the GPU. 3. At the risk of repeating myself (again): GDI -> CPU+RAM, DirectX -> GPU+VRAM. 4. You can only get VRAM pages by locking them, preventing the GPU from moving them around/using them. 5. I don't know what you're missing; it's extremely clear: GDI knows how to draw bitmaps, DirectX draws vertices and triangles, with textures and shaders projected on them. Try using them and you'll get a pretty good idea. – Summit
I'm really sorry to bother you again, but after reading Wikipedia's article on hardware overlay, it seems I was not completely wrong: "Without any hardware overlays, only one chunk of video memory exists which all applications must share [...]. To escape these limitations, the hardware overlay was invented. [...] An application using a hardware overlay gets a completely separate section of video memory that belongs only to that application." The "Screen shots" section is also very interesting; actually, it gives the answer to my question. – Emissive
@Summit it's a nice explanation. For a specific reason, can I get an event/callback when a screenshot of a DirectX overlay window is taken (of course without using DirectX itself)? – Abecedarian
You could use the low-level keyboard event (raw API) to get notified when the user presses Print Screen, if that's what you're asking. – Summit

I believe it is actually due to double buffering. I'm not 100% sure, but that was the case when I tested screenshots in OpenGL: I would notice that the DC on my window was not the same; it was using two different DCs for that one game. For other games I wasn't sure what was going on. The DC was the same, but SwapBuffers was called so many times that I don't think GDI was even fast enough to screenshot it; sometimes I would get half a screenshot and half black.

However, when I hooked into the client, I was able to just ask for the pixels like normal, no GDI or anything. I think there is a reason why we don't use GDI when drawing in games that use DirectX or OpenGL.

You can always look at ways to capture the screen here: http://www.codeproject.com/Articles/5051/Various-methods-for-capturing-the-screen

Anyway, I use the following for grabbing data from DirectX:

// Saves the current back buffer to a PNG file using D3DX.
HRESULT DXI_Capture(IDirect3DDevice9* Device, const char* FilePath)
{
    IDirect3DSurface9* RenderTarget = nullptr;

    // Grab the first back buffer of the implicit swap chain.
    HRESULT result = Device->GetBackBuffer(0, 0, D3DBACKBUFFER_TYPE_MONO, &RenderTarget);
    if (FAILED(result) || RenderTarget == nullptr)
        return result;

    // Write the surface straight to disk; D3DX handles the encoding.
    result = D3DXSaveSurfaceToFile(FilePath, D3DXIFF_PNG, RenderTarget, nullptr, nullptr);
    SafeRelease(RenderTarget);
    return result;
}

Then, in my hooked EndScene, I call it like so:

HRESULT Direct3DDevice9Proxy::EndScene()
{
    // Capture the frame just before the real EndScene() runs.
    DXI_Capture(ptr_Direct3DDevice9, "C:/Users/School/Desktop/Screenshot.png");

    return ptr_Direct3DDevice9->EndScene();
}

You can either use Microsoft Detours to hook the EndScene of some external application, or you can use a wrapper .dll.
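
For completeness, a rough sketch of what the Detours route might look like. This assumes you have already recovered the real EndScene address (for example by reading it out of a dummy device's vtable); RealEndScene and HookedEndScene are placeholder names, not part of the code above:

#include <windows.h>
#include <d3d9.h>
#include <detours.h>

typedef HRESULT (APIENTRY *EndScene_t)(IDirect3DDevice9*);
EndScene_t RealEndScene = nullptr;   // must hold the genuine EndScene address before attaching

HRESULT APIENTRY HookedEndScene(IDirect3DDevice9* device)
{
    // e.g. call DXI_Capture(device, ...) here
    return RealEndScene(device);
}

void InstallHook()
{
    DetourTransactionBegin();
    DetourUpdateThread(GetCurrentThread());
    DetourAttach(&(PVOID&)RealEndScene, (PVOID)HookedEndScene);
    DetourTransactionCommit();
}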

Terris answered 3/2, 2014 at 20:9 Comment(0)
